Convert Chrome/WebKit timestamps to human-readable date
The WebKit timestamp format, also called Precision Time or High-Resolution Time, is used by Google Chrome (Chromium)
in its base::Time class.
It is a 64-bit signed integer (int64_t) counting microseconds since January 1, 1601 00:00:00 UTC.
One microsecond is one-millionth of a second.
Google Chrome and other Chromium browsers like Microsoft Edge, Brave and Opera use this timestamp format internally for various purposes, such as tracking history, cookies, cache, and performance metrics.
You can find these timestamps in Chrome's SQLite databases, such as History, Cookies, and Cache.
If you open the History file in a database viewer, the visit_time and last_visit_time fields
are WebKit timestamps; in the Cookies database, so are the creation_utc and expires_utc fields.
An example of a recent WebKit timestamp is 13418075856000000.
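Since the Unix epoch (January 1, 1970) falls exactly 11,644,473,600 seconds after January 1, 1601, converting between the two scales is plain arithmetic. A minimal Python sketch (the function names are just for illustration):

```python
# Seconds between 1601-01-01 (WebKit epoch) and 1970-01-01 (Unix epoch)
WEBKIT_TO_UNIX_OFFSET = 11_644_473_600

def webkit_to_unix(webkit_us: int) -> float:
    """Convert WebKit microseconds to Unix seconds."""
    return webkit_us / 1_000_000 - WEBKIT_TO_UNIX_OFFSET

def unix_to_webkit(unix_s: float) -> int:
    """Convert Unix seconds back to WebKit microseconds."""
    return int((unix_s + WEBKIT_TO_UNIX_OFFSET) * 1_000_000)

print(webkit_to_unix(13_418_075_856_000_000))  # → 1773602256.0
```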
Modern WebKit (Safari)
The WebKit engine (used by Safari) has moved away from this format and now uses the Unix epoch (January 1, 1970) for its internal time representation. Use the converter on the homepage for that format.
Programming routines
SQLite (for Chrome history)
SELECT datetime(visit_time / 1000000 + (strftime('%s', '1601-01-01')), 'unixepoch')
FROM visits;
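You can try the same query from Python's built-in sqlite3 module without touching a real profile; the one-column schema below is a stand-in for Chrome's actual History database, not its real layout:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE visits (visit_time INTEGER)")
# A sample WebKit timestamp (microseconds since 1601-01-01 UTC)
conn.execute("INSERT INTO visits VALUES (13418075856000000)")

# Shift the WebKit value (in seconds) by the 1601->1970 epoch offset,
# then let SQLite render it as a UTC date string
row = conn.execute(
    "SELECT datetime(visit_time / 1000000 + strftime('%s', '1601-01-01'),"
    " 'unixepoch') FROM visits"
).fetchone()
print(row[0])
```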
Python
import datetime

def date_from_webkit(webkit_timestamp):
    # 1. Define the epoch as UTC (on Python 3.11+, datetime.UTC is an
    #    equivalent alias for datetime.timezone.utc)
    epoch_start = datetime.datetime(1601, 1, 1, tzinfo=datetime.timezone.utc)
    # 2. Add the microseconds
    delta = datetime.timedelta(microseconds=int(webkit_timestamp))
    utc_time = epoch_start + delta
    # 3. Print both UTC and local time for clarity
    print(f"UTC Time:   {utc_time.strftime('%Y-%m-%d %H:%M:%S.%f')}")
    print(f"Local Time: {utc_time.astimezone().strftime('%Y-%m-%d %H:%M:%S.%f')}")

# Usage
try:
    in_time = input('Enter a WebKit timestamp: ').strip()
    if in_time:
        date_from_webkit(in_time)
except ValueError:
    print("Please enter a valid numeric timestamp.")
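Going the other way, from a date back to a WebKit timestamp, is the same arithmetic in reverse. A sketch (the function name is just for illustration; naive datetimes are assumed to be UTC):

```python
import datetime

def webkit_from_date(dt: datetime.datetime) -> int:
    """Return microseconds between dt and the WebKit epoch (1601-01-01 UTC)."""
    if dt.tzinfo is None:
        # Assumption: treat naive datetimes as UTC
        dt = dt.replace(tzinfo=datetime.timezone.utc)
    epoch_start = datetime.datetime(1601, 1, 1, tzinfo=datetime.timezone.utc)
    # Integer division of the timedelta by one microsecond yields an int count
    return (dt - epoch_start) // datetime.timedelta(microseconds=1)

print(webkit_from_date(datetime.datetime(1601, 1, 1)))  # → 0
```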