Year 2038 problem

[Image: Example showing how the date would reset at 03:14:08 UTC on 19 January 2038.]
The year 2038 problem may cause some computer software to fail before or in the year 2038. The problem affects programs that use the POSIX time representation, which encodes system time as the number of seconds (ignoring leap seconds) elapsed since 00:00:00 UTC on January 1, 1970. This representation is standard in Unix-like operating systems, and because of the broad deployment of C it also affects software written for most other operating systems. On most 32-bit systems, the time_t data type used to store this second count is a signed 32-bit integer. The latest time that can be represented in this format, following the POSIX standard, is 03:14:07 UTC on Tuesday, January 19, 2038. One second later the count overflows: times beyond this moment "wrap around" and are stored internally as a negative number, which programs interpret not as a date in 2038 but as one in 1901. Erroneous calculations, erroneous decisions, and outright failures may result.
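A minimal C sketch of the wraparound (an illustration only, not code from any particular system; it assumes two's-complement truncation when narrowing to 32 bits and a host whose own time_t is wide enough to print the wrapped value):

```c
#include <stdio.h>
#include <stdint.h>
#include <time.h>

int main(void)
{
    /* Emulate a signed 32-bit time_t.  2147483647 (2^31 - 1) seconds
       after the epoch is 03:14:07 UTC on 19 January 2038. */
    int64_t wide   = INT32_MAX;
    int32_t narrow = (int32_t)wide;          /* still fits */
    time_t  t      = (time_t)narrow;
    printf("last representable second: %s", asctime(gmtime(&t)));

    /* One second later the value no longer fits in 32 bits; on typical
       two's-complement machines it truncates to -2147483648, which the
       same code reads back as 20:45:52 UTC on 13 December 1901. */
    wide  += 1;
    narrow = (int32_t)wide;                  /* implementation-defined wrap */
    t      = (time_t)narrow;
    printf("one second after wrap:     %s", asctime(gmtime(&t)));
    return 0;
}
```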

"Year 2038" is frequently abbreviated to "Y2038", "Y2K38".[1]

Known problems

In May 2006, reports surfaced of an early manifestation of the Y2038 problem in the AOLserver software. The software indicated that a database request should "never" time out by setting a timeout date one billion seconds in the future. One billion seconds (roughly 31.7 years) after 21:27:28 on 12 May 2006 is beyond the 2038 cutoff date, so after this moment the timeout calculation overflowed and produced a timeout date that was actually in the past, causing the software to crash.[2][3]
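A hedged sketch of that failure mode in C (the timestamp and the 32-bit arithmetic are illustrative, not AOLserver's actual code):

```c
#include <stdio.h>
#include <stdint.h>
#include <time.h>

int main(void)
{
    /* Illustrative "current time" in May 2006, as seconds since the epoch. */
    int32_t now = 1147483648;
    /* "Never time out" expressed as now + one billion seconds, computed widely. */
    int64_t wide_deadline = (int64_t)now + 1000000000;

    /* Truncated back into a signed 32-bit time_t, the deadline wraps
       negative, i.e. it looks like a date in 1901 -- already expired. */
    int32_t deadline = (int32_t)wide_deadline;
    time_t t = (time_t)deadline;

    printf("intended deadline (64-bit math): %lld\n", (long long)wide_deadline);
    printf("stored deadline   (32-bit):      %d\n", deadline);
    printf("which is: %s", asctime(gmtime(&t)));
    return 0;
}
```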

Solutions

There is no easy fix for this problem for existing CPU/OS combinations. Changing the definition of time_t to a 64-bit type would break binary compatibility for software, data storage, and generally anything that deals with the binary representation of time. Changing time_t to an unsigned 32-bit integer, which would keep timestamps accurate until the year 2106, would break the many programs that compute time differences, since dates before 1970 would become unrepresentable and subtracting a later unsigned timestamp from an earlier one can never yield a negative result.
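A small C illustration of why an unsigned time_t breaks time-difference code (the values are arbitrary):

```c
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    /* Two timestamps one hour apart, stored as 32-bit second counts. */
    uint32_t earlier = 1000000000;
    uint32_t later   = 1000003600;

    /* Signed arithmetic gives the expected negative difference... */
    int32_t signed_diff = (int32_t)earlier - (int32_t)later;
    printf("signed:   %d seconds\n", signed_diff);        /* -3600 */

    /* ...but unsigned arithmetic wraps around to a huge positive value. */
    uint32_t unsigned_diff = earlier - later;
    printf("unsigned: %u seconds\n", unsigned_diff);      /* 4294963696 */
    return 0;
}
```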

Most operating systems for 64-bit architectures already use a 64-bit integer for time_t. The move to these architectures is underway, and many expect it to be complete before 2038. A signed 64-bit second count pushes the wraparound date roughly 290 billion years into the future. However, as of 2007 hundreds of millions of 32-bit systems remain deployed, many of them embedded, and it is far from certain that they will all be replaced by 2038. Although general-purpose computers are typically refreshed every 18 to 24 months, embedded computers may operate unchanged for the life of the system they control. The 32-bit time_t has also been encoded into some file formats, so it can live on long beyond the life of the machines involved.
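The 290-billion-year figure follows from simple arithmetic; a quick back-of-the-envelope check in C (assuming an average year of 365.25 days):

```c
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    /* How far a signed 64-bit second counter reaches, assuming an
       average year of 365.25 days (~31,557,600 seconds). */
    double seconds_per_year = 365.25 * 24.0 * 3600.0;
    double years = (double)INT64_MAX / seconds_per_year;
    printf("approx. %.3g years\n", years);   /* about 2.9e11, i.e. ~290 billion */
    return 0;
}
```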

A variety of alternative proposals have been made, some of which are already in use, including storing either milliseconds or microseconds since an epoch (typically either January 1, 1970 or January 1, 2000) in a signed 64-bit integer, which gives a range of roughly 290,000 years even at microsecond resolution and far more at millisecond resolution.[4][5] Other proposed time representations offer different precisions, ranges, and sizes (almost always wider than 32 bits), and some also address related problems such as the handling of leap seconds.
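As one concrete illustration of the 64-bit millisecond approach (a sketch in the spirit of Java's System.currentTimeMillis(), not its implementation; it assumes the host's time() value fits in 64 bits):

```c
#include <stdio.h>
#include <stdint.h>
#include <time.h>

int main(void)
{
    /* Milliseconds since 00:00:00 UTC on 1 January 1970, held in a
       signed 64-bit integer; whole-second resolution only here, since
       the value is derived from time(). */
    int64_t millis = (int64_t)time(NULL) * 1000;
    printf("milliseconds since the epoch: %lld\n", (long long)millis);
    return 0;
}
```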

See also

  • Year 2000 problem
  • Year 10,000 problem

References

1. ^ The Year-2038 Bug. Retrieved on 2007-07-13.
2. ^ The Future Lies Ahead (2006-06-28). Retrieved on 2006-11-19.
3. ^ Shiobara, Dossy (2006-05-17). Something wrong after 2006-05-12 21:25. Retrieved on 2006-11-19.
4. ^ Unununium Time. Archived from the original on 2006-08-04. Retrieved on 2006-11-19.
5. ^ Sun Microsystems. Java API documentation: System.currentTimeMillis.

External links
