[Converter widget: shows the current Unix timestamp (seconds since epoch) and converts Timestamp → Date and Date → Timestamp, accepting Unix seconds or milliseconds]
What is a Unix timestamp?
A Unix timestamp is the number of seconds that have elapsed since 00:00:00 UTC on 1 January 1970 — the "Unix epoch". It is used universally in programming to represent points in time as a single integer, making date arithmetic simple and timezone-independent.
Millisecond timestamps are also common in JavaScript and modern APIs — these are simply the Unix timestamp multiplied by 1000. This converter supports both formats automatically.
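The automatic handling of both formats can be sketched in Python. The digit-count heuristic below (treating values of 13 or more digits as milliseconds) is an assumption about how such a converter distinguishes the two, not a description of this tool's actual code:

```python
from datetime import datetime, timezone

def to_utc_datetime(ts: int) -> datetime:
    """Convert a Unix timestamp to an aware UTC datetime.

    Assumption: values in the millisecond range (>= 10**12, i.e. 13+
    digits for contemporary dates) are milliseconds; smaller values
    are seconds.
    """
    if abs(ts) >= 10**12:
        ts = ts / 1000  # milliseconds -> seconds
    return datetime.fromtimestamp(ts, tz=timezone.utc)

# Both forms resolve to the same instant:
print(to_utc_datetime(1715000000))     # seconds
print(to_utc_datetime(1715000000000))  # milliseconds
```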
Frequently asked questions
What is the Unix epoch?
The Unix epoch is 00:00:00 UTC on January 1, 1970. All Unix timestamps measure seconds elapsed since this moment. It was chosen as a convenient reference point when Unix was being developed.
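The definition is easy to check in Python: timestamp 0 converts to the epoch itself.

```python
from datetime import datetime, timezone

# Timestamp 0 is the Unix epoch: 00:00:00 UTC on 1 January 1970
epoch = datetime.fromtimestamp(0, tz=timezone.utc)
print(epoch)  # 1970-01-01 00:00:00+00:00
```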
What is the difference between seconds and milliseconds?
Unix timestamps in seconds are currently 10 digits (e.g. 1715000000), while millisecond timestamps are 13 digits (e.g. 1715000000000). JavaScript's Date.now() returns milliseconds; most Unix/Linux systems and APIs use seconds.
What is the Year 2038 problem?
Systems that store Unix timestamps as signed 32-bit integers overflow at 2147483647 seconds, which corresponds to 03:14:07 UTC on January 19, 2038. Systems using 64-bit time values are unaffected and can represent dates billions of years into the future.
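The wraparound can be simulated in Python by reinterpreting the 32-bit value as a signed integer, a sketch of what happens inside an overflowing 32-bit time_t:

```python
import struct
from datetime import datetime, timezone

MAX_32BIT = 2**31 - 1  # 2147483647, the last second a signed 32-bit time_t can hold
print(datetime.fromtimestamp(MAX_32BIT, tz=timezone.utc))  # 2038-01-19 03:14:07+00:00

# One second later, the 32-bit counter wraps around to -2**31,
# which corresponds to a date in December 1901:
wrapped = struct.unpack(">i", struct.pack(">I", MAX_32BIT + 1))[0]
print(wrapped)                                           # -2147483648
print(datetime.fromtimestamp(wrapped, tz=timezone.utc))  # 1901-12-13 20:45:52+00:00
```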
How do I get the current Unix timestamp in code?
JavaScript: Math.floor(Date.now()/1000)
Python: import time; int(time.time())
PHP: time()
Unix shell: date +%s
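Beyond reading the current timestamp, the converter's two directions are a round trip. A Python sketch, using 2024-05-06 12:00 UTC as an arbitrary example date:

```python
from datetime import datetime, timezone

# Date -> timestamp: build an aware UTC datetime and take .timestamp()
dt = datetime(2024, 5, 6, 12, 0, 0, tzinfo=timezone.utc)
ts = int(dt.timestamp())
print(ts)  # 1714996800

# Timestamp -> date: the reverse direction recovers the same instant
print(datetime.fromtimestamp(ts, tz=timezone.utc))  # 2024-05-06 12:00:00+00:00
```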
Get current timestamp
JavaScript: Date.now()
Python: int(time.time())
PHP: time()
Shell: date +%s
SQL: UNIX_TIMESTAMP()
Common timestamps
Unix epoch: 0
Y2K: 946684800
2038 problem: 2147483647
JS Date max (seconds): 8640000000000