Unix Timestamp Converter
Current timestamp (seconds):
Timestamp → Date
Date → Timestamp
Interval Calculator
What is a Unix Timestamp?
A Unix timestamp (or Unix time) is the number of seconds elapsed since January 1, 1970, 00:00:00 UTC (the Unix epoch), not counting leap seconds. It is a universal format used across computer systems to store and compare dates easily, independently of time zones.
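As a minimal sketch, a timestamp of 0 maps exactly to the epoch; Python's standard library can do the conversion directly:

```python
from datetime import datetime, timezone

# Convert a Unix timestamp (seconds since 1970-01-01 UTC) to a UTC date.
ts = 0
dt = datetime.fromtimestamp(ts, tz=timezone.utc)
print(dt.isoformat())  # 1970-01-01T00:00:00+00:00
```

Passing `tz=timezone.utc` keeps the result independent of the machine's local time zone, which is the whole point of the format.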
Why use timestamps?
- Database storage: an integer takes less space than a formatted date.
- Calculations and comparisons: easy duration calculations and sorting.
- APIs and logs: standard for event timestamping.
- Time zone independent: always UTC.
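The "calculations and comparisons" point above comes down to plain integer arithmetic; a small illustration (the example values are arbitrary):

```python
# Durations and chronological ordering are simple arithmetic on timestamps.
start = 1700000000   # some event, in seconds
end   = 1700003600   # an event 3600 seconds (one hour) later

print(end - start)           # 3600 -> duration in seconds
print(sorted([end, start]))  # sorting integers = sorting chronologically
```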
Seconds or milliseconds?
The classic Unix timestamp is expressed in seconds. However, many languages work in milliseconds (JavaScript's Date.now(), Java's System.currentTimeMillis()). Our tool handles both automatically: a number ≥ 1,000,000,000,000 (13 digits or more) is interpreted as milliseconds.
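The detection rule described above can be sketched in a few lines; `normalize` is a hypothetical helper name, not part of any library:

```python
MS_THRESHOLD = 1_000_000_000_000  # 13-digit values are assumed to be milliseconds

def normalize(ts: int) -> int:
    """Return the timestamp in seconds, auto-detecting millisecond input."""
    return ts // 1000 if ts >= MS_THRESHOLD else ts

print(normalize(1700000000))     # 1700000000 (already in seconds)
print(normalize(1700000000000))  # 1700000000 (milliseconds detected)
```

The heuristic works because 13-digit second values would correspond to dates tens of thousands of years away, so real-world inputs are unambiguous.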
The Year 2038 problem
On 32-bit systems, the Unix timestamp is stored in a signed 32-bit integer, whose maximum value is 2,147,483,647, reached on January 19, 2038, at 03:14:07 UTC. One second later the counter overflows, wrapping to a negative value. Modern systems use 64-bit integers, pushing the problem far into the future.
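The exact rollover instant can be checked by converting the 32-bit maximum itself:

```python
from datetime import datetime, timezone

# The largest signed 32-bit value, and the UTC instant it represents.
INT32_MAX = 2**31 - 1  # 2,147,483,647
print(datetime.fromtimestamp(INT32_MAX, tz=timezone.utc))
# 2038-01-19 03:14:07+00:00
```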
Frequently Asked Questions
What is the difference between UTC and GMT?
UTC (Coordinated Universal Time) is the modern successor to GMT. For everyday use, they are equivalent. Unlike local times, UTC does not change with the seasons.
How do I convert a Unix timestamp in Excel?
Excel uses January 1, 1900 as its date reference. For a timestamp in seconds, the formula is =(timestamp/86400)+25569, then format the cell as a date.
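The Excel formula above can be verified numerically; `to_excel_serial` is an illustrative helper name, and 25569 is the Excel serial number of January 1, 1970:

```python
def to_excel_serial(ts_seconds: float) -> float:
    """Convert a Unix timestamp (seconds) to an Excel date serial number."""
    return ts_seconds / 86400 + 25569

print(to_excel_serial(0))      # 25569.0 -> 1970-01-01 in Excel
print(to_excel_serial(86400))  # 25570.0 -> 1970-01-02, one day later
```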
Does the Unix timestamp count leap seconds?
No, the Unix timestamp ignores leap seconds. Every day is considered to have exactly 86,400 seconds.
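This can be seen around a real leap second: one was inserted at the end of December 31, 2016, yet in Unix time the two surrounding midnights are still exactly 86,400 seconds apart:

```python
from datetime import datetime, timezone

# Midnight UTC on either side of the 2016-12-31 leap second.
a = datetime(2016, 12, 31, tzinfo=timezone.utc).timestamp()
b = datetime(2017, 1, 1, tzinfo=timezone.utc).timestamp()
print(b - a)  # 86400.0 -- the leap second is invisible in Unix time
```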