Convert Unix timestamps to readable dates and vice versa. Enter a timestamp or date and the result appears instantly.

A Unix timestamp (epoch time) is the number of seconds elapsed since January 1, 1970, 00:00:00 UTC. It is the standard time representation in programming, databases, APIs, and log files.
Example: 1700000000 = November 14, 2023, 22:13:20 UTC. The current timestamp increases by 1 every second.
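The same conversion the tool performs can be sketched in a few lines of Python (the document lists Python among the languages that use second-based timestamps); `datetime.fromtimestamp` with an explicit UTC timezone does the timestamp-to-date direction, and `.timestamp()` goes back:

```python
from datetime import datetime, timezone

# Timestamp -> readable UTC date, using the example value from above
ts = 1700000000
dt = datetime.fromtimestamp(ts, tz=timezone.utc)
print(dt.isoformat())  # 2023-11-14T22:13:20+00:00

# Date -> timestamp (round trip back to the original value)
assert int(dt.timestamp()) == ts
```

Passing `tz=timezone.utc` matters: without it, `fromtimestamp` converts to the machine's local timezone, which is a common source of off-by-hours confusion.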
Converting timestamps is essential for developers working with APIs, debugging log files, analyzing database records, and understanding time-based data.
All calculations run locally in your browser — nothing is sent to any server.
Unix time counts seconds from the epoch: January 1, 1970, 00:00:00 UTC. This date was chosen as a convenient reference point for the original Unix operating system.
Timestamps can be in seconds (10 digits: 1700000000) or milliseconds (13 digits: 1700000000000). JavaScript Date uses milliseconds.
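The digit-count distinction above suggests a simple normalization heuristic. This is a sketch, not a universal rule (a hypothetical `normalize_to_seconds` helper, assuming any value of 13 or more digits is milliseconds):

```python
def normalize_to_seconds(ts: int) -> float:
    """Heuristic: treat 13+-digit values as milliseconds, shorter ones as seconds."""
    if ts >= 1_000_000_000_000:  # smallest 13-digit number
        return ts / 1000.0
    return float(ts)

# Both forms of the example timestamp normalize to the same instant
assert normalize_to_seconds(1700000000000) == 1700000000.0
assert normalize_to_seconds(1700000000) == 1700000000.0
```

The heuristic breaks down for dates before 2001 expressed in milliseconds (fewer than 13 digits), so real converters usually let the user pick the unit explicitly.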
The Year 2038 problem: a 32-bit signed integer can count at most 2,147,483,647 seconds, so it overflows on January 19, 2038, at 03:14:07 UTC. Most modern systems use 64-bit timestamps to avoid this.
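The exact overflow moment falls straight out of the maximum 32-bit signed value; a quick check in Python confirms it:

```python
from datetime import datetime, timezone

INT32_MAX = 2**31 - 1  # 2147483647, the largest 32-bit signed integer
overflow_moment = datetime.fromtimestamp(INT32_MAX, tz=timezone.utc)
print(overflow_moment.isoformat())  # 2038-01-19T03:14:07+00:00
```

One second later the counter wraps to a negative value on a 32-bit system, which naive code interprets as a date in December 1901.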
| Feature | Unix (seconds) | Unix (milliseconds) |
|---|---|---|
| Digits | 10 digits | 13 digits |
| Example | 1700000000 | 1700000000000 |
| Used by | PHP, Python, MySQL | JavaScript, Java |
| Precision | 1 second | 1 millisecond |
| Max (32-bit) | Jan 19, 2038 | N/A (uses 64-bit) |

Have an idea, found a bug, or want to suggest a feature? Drop us a message – we respond within 24 hours.