# Unix Timestamp Converter
| Event | Unix Timestamp (s) | Date |
|---|---|---|
| Unix Epoch Start | 0 | January 1, 1970 00:00:00 UTC |
| Y2K | 946684800 | January 1, 2000 00:00:00 UTC |
| Y2K38 Problem | 2147483647 | January 19, 2038 03:14:07 UTC |
| Year 2100 | 4102444800 | January 1, 2100 00:00:00 UTC |
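Any row in the table can be checked with a couple of lines of JavaScript/TypeScript. The sketch below (one possible approach, not tied to any particular tool) converts the Y2K38 value back to a date:

```typescript
// Verify the Y2K38 row: 2147483647 is the largest value a signed 32-bit counter can hold.
const y2k38 = 2147483647; // seconds

// The Date constructor expects milliseconds, so multiply by 1000 first.
const asDate = new Date(y2k38 * 1000);

console.log(asDate.toISOString()); // "2038-01-19T03:14:07.000Z"
```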
## What is a Unix Timestamp?
A Unix timestamp is the number of seconds that have elapsed since January 1, 1970 at 00:00:00 UTC, a reference point known as the Unix Epoch. Because they are timezone-independent, Unix timestamps are ideal for storing and comparing times across different systems and regions. A timestamp of 0 represents exactly the epoch start, and negative values represent times before 1970.
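For example, the current Unix timestamp in seconds can be read in TypeScript like this (a minimal sketch; the example output in the comments is illustrative):

```typescript
// Seconds elapsed since the Unix Epoch (1970-01-01T00:00:00Z).
// Date.now() returns milliseconds, so divide by 1000 and floor.
const nowSeconds = Math.floor(Date.now() / 1000);

// Timestamp 0 is exactly the epoch start.
console.log(new Date(0).toISOString()); // "1970-01-01T00:00:00.000Z"

// Negative values fall before 1970 (here, one day earlier).
console.log(new Date(-86400 * 1000).toISOString()); // "1969-12-31T00:00:00.000Z"
```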
## Seconds vs Milliseconds
Traditional Unix timestamps count elapsed seconds and are typically 10 digits long (e.g. 1700000000). JavaScript's Date.now() and many modern APIs return milliseconds instead, producing 13-digit values (e.g. 1700000000000). To convert seconds to milliseconds, multiply by 1000. To convert milliseconds to seconds, divide by 1000 and floor the result.
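Here is a minimal sketch of both conversions in TypeScript; the helper names secondsToMillis and millisToSeconds are purely illustrative, not a standard API:

```typescript
// Convert between second- and millisecond-precision Unix timestamps.
const secondsToMillis = (s: number): number => s * 1000;
const millisToSeconds = (ms: number): number => Math.floor(ms / 1000);

const ms = Date.now();          // e.g. 1700000000000 (13 digits)
const s = millisToSeconds(ms);  // e.g. 1700000000    (10 digits)

console.log(secondsToMillis(1700000000)); // 1700000000000
```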
## Unix Timestamp vs ISO 8601
Unix timestamps are compact integers, making them efficient for storage, sorting, and arithmetic. ISO 8601 (e.g. 2024-03-15T09:30:00.000Z) is a human-readable string format that's self-describing and unambiguous. Use timestamps internally in databases and APIs; use ISO 8601 in logs, user interfaces, and anywhere a human might read the value.
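The round trip between the two formats takes only the built-in Date object; toIso and fromIso below are hypothetical helper names used for illustration:

```typescript
// Unix timestamp (seconds) -> ISO 8601 string.
const toIso = (seconds: number): string =>
  new Date(seconds * 1000).toISOString();

// ISO 8601 string -> Unix timestamp (seconds).
const fromIso = (iso: string): number =>
  Math.floor(Date.parse(iso) / 1000);

console.log(toIso(1710495000));                   // "2024-03-15T09:30:00.000Z"
console.log(fromIso("2024-03-15T09:30:00.000Z")); // 1710495000
```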