Deep dive
Unix timestamps explained: seconds, milliseconds, and time zones
Epoch time for developers—pitfalls, offsets, and our timestamp converter tool.
Unix time counts seconds (or sometimes milliseconds) since the Unix epoch: 1970-01-01 00:00:00 UTC, excluding leap seconds in the most common API definitions. Developers encounter timestamps in JWT exp claims, log lines, database columns, and mobile app sync protocols. Human brains do not parse ten-digit integers instinctively, so conversion tools and disciplined time-zone handling are essential.
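To make the definition concrete, here is a minimal sketch of turning an epoch value into a readable UTC string. The value 1700000000 is an arbitrary example, not one from the article.

```javascript
// An epoch value in seconds; JavaScript's Date constructor expects milliseconds.
const epochSeconds = 1700000000; // arbitrary example value

const date = new Date(epochSeconds * 1000);
console.log(date.toISOString()); // "2023-11-14T22:13:20.000Z"
```

`toISOString()` always renders in UTC, which makes it a safe default for logs and debugging output.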
Seconds vs milliseconds
JavaScript’s Date.now() returns milliseconds, while many server frameworks default to seconds. Mixing them produces dates in 1970 or far-future years. When you ingest a number, compare its magnitude: values around 1e9–1e10 are likely seconds; values around 1e12–1e13 are likely milliseconds. Always document which unit your API emits.
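The magnitude check above can be sketched as a small helper. `toMilliseconds` is a hypothetical function for illustration, not a standard API, and its thresholds simply encode the ranges mentioned in the paragraph.

```javascript
// Heuristic sketch: guess whether an epoch value is in seconds or
// milliseconds from its magnitude, and normalize to milliseconds.
function toMilliseconds(value) {
  if (value >= 1e12) return value;        // 1e12–1e13: likely already milliseconds
  if (value >= 1e9)  return value * 1000; // 1e9–1e10: likely seconds
  throw new RangeError(`ambiguous epoch value: ${value}`);
}

console.log(toMilliseconds(1700000000));    // seconds in  → 1700000000000
console.log(toMilliseconds(1700000000000)); // ms in       → 1700000000000
```

A heuristic like this is a stopgap for untrusted input; the real fix is documenting the unit in the API contract.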
Time zones and offsets
Unix timestamps are typically stored in UTC; presentation shifts with local offsets and daylight saving rules. Converting “wall clock” times across regions requires a timezone database—do not hard-code offsets for recurring meetings. For auditing, store UTC plus an explicit offset or zone id when legal requirements demand local civil time.
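One way to present a single UTC instant in several regions without hard-coding offsets is to lean on the platform's IANA time-zone database via `Intl.DateTimeFormat`. The zones and instant below are illustrative choices, not values from the article.

```javascript
// One fixed UTC instant, rendered per-zone using IANA zone rules
// (including daylight saving), rather than hand-maintained offsets.
const instant = new Date(1700000000 * 1000);

const formatIn = (timeZone) =>
  new Intl.DateTimeFormat("en-US", {
    timeZone,
    dateStyle: "medium",
    timeStyle: "long",
  }).format(instant);

console.log(formatIn("UTC"));
console.log(formatIn("America/New_York")); // offset shifts with DST rules
console.log(formatIn("Asia/Tokyo"));
```

Because the zone database ships with the runtime, the same code keeps producing correct local times when DST rules change, which a stored numeric offset cannot do.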
Leap seconds and leap days
Leap years affect ordinary calendar math; leap seconds are a separate concern, typically absorbed by clock-smearing strategies in large data centers rather than handled in application code. For most applications, trusting the platform clock and NTP synchronization is enough. High-precision trading or scientific systems are another matter; involve specialists there.
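The calendar-math side is easy to get wrong by hand; the Gregorian leap-year rule is a worthwhile reference point. This is a standard textbook predicate, sketched here for completeness.

```javascript
// Gregorian rule: every 4th year is a leap year, except centuries,
// unless the century is divisible by 400.
function isLeapYear(year) {
  return (year % 4 === 0 && year % 100 !== 0) || year % 400 === 0;
}

console.log(isLeapYear(2000)); // true  (divisible by 400)
console.log(isLeapYear(1900)); // false (century not divisible by 400)
console.log(isLeapYear(2024)); // true
```

In practice, prefer the platform's date library over reimplementing this rule; the sketch only shows why naive "year % 4" checks eventually misfire.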
Convert with our tool
Paste values into the Timestamp Converter to bounce between epoch seconds, milliseconds, and localized calendar strings. Use JSON Formatter when inspecting API payloads where dates appear as strings or numbers, and JWT Decoder to read exp fields quickly during debugging.