Cookies are small pieces of data stored on a user's computer by websites they visit. They are used to remember information about the user or their preferences on a particular site.
When using the Python Requests library for web scraping or automation, you may need to persist cookies between sessions. For example, you may want to save cookies after logging into a site and load them again later to remain logged in. In this comprehensive guide, I'll explain how to save and load cookies when using Requests in Python.
What Are Cookies and How Do They Work?
Cookies are small pieces of text data stored by the browser and sent along with subsequent requests to the site that set them. They allow sites to identify users and store preferences, items in shopping carts, session data, and more.
Each cookie has a name, value, expiration date, path, domain, and other metadata. When a server sends a Set-Cookie header in response to a request, the browser stores the cookie and sends it back on subsequent requests to the same domain.
For example, when you log into a site, it may set a cookie containing your user ID or authentication token. On future requests, this cookie gets sent along so you remain logged in as you navigate the site.
Cookies enable statefulness and customization across requests. HTTP itself is stateless – each request is independent, with no built-in memory of previous interactions – so cookies are what let a server tie your requests together.
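To see this behavior from Python, here is a quick sketch using the public httpbin.org test service (the /cookies endpoints belong to httpbin, not to Requests itself, so treat the URLs as an assumption):

```python
import requests

session = requests.Session()

# httpbin responds with a Set-Cookie header, which the session stores
session.get("https://httpbin.org/cookies/set?flavor=oatmeal")

# The same session sends the cookie back automatically on the next request
resp = session.get("https://httpbin.org/cookies")
print(resp.json())                    # {'cookies': {'flavor': 'oatmeal'}}
print(session.cookies.get("flavor"))  # 'oatmeal'
```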
Why Save and Load Cookies with Python Requests?
The Requests module makes it very easy to make HTTP calls in Python, and a Session object will track cookies in memory while your script runs. But nothing is persisted to disk by default: once your script exits, all cookie data is lost, and each new Session starts fresh with no cookies.
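Here is a quick illustration of that point, again using httpbin.org as a convenient test server (an assumption, not part of Requests):

```python
import requests

session = requests.Session()
session.get("https://httpbin.org/cookies/set?flavor=oatmeal")
print(len(session.cookies))   # 1 – the cookie exists only in memory

fresh = requests.Session()
print(len(fresh.cookies))     # 0 – a new session starts with an empty jar
```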
Saving and loading cookies becomes important when:
- You want to maintain a persistent session across executions of your program.
- You need to pause a long-running scraper or bot, and resume from where you left off.
- You want to speed up development by reusing cookie sessions.
- You want to isolate sessions and store different cookie sets.
Some example use cases:
- Log in once – Log into a site, save cookies, then load them to remain logged in for future requests. Avoid re-logging in each time.
- Resume scraping – Pause a long scrape by saving cookies, shut down, then load cookies and continue where you left off.
- Shared sessions – Export and import cookies between different scripts to share logged-in sessions.
- Bot development – Quickly recreate previous sessions by reloading saved cookie states.
The Requests module provides two handy utility functions for serializing and deserializing cookies – dict_from_cookiejar() and cookiejar_from_dict(). Next, I'll explain how these work, along with examples for saving and loading cookies to file.
Serializing Cookies to JSON with dict_from_cookiejar()
The RequestsCookieJar object stores cookies for a Session. But this cookiejar format isn't directly serializable to JSON or other string formats, so we need to convert between cookiejars and dictionaries to encode cookies as JSON. The dict_from_cookiejar() function handles this conversion for us:
```python
cookies = requests.utils.dict_from_cookiejar(session.cookies)
```
It takes a Requests CookieJar and returns a plain dictionary mapping each cookie's name to its value:

```python
{"session_id": "12345", "csrf_token": "abcdef"}
```
Now this dictionary can be serialized to JSON for storage or transfer:
```python
import json

json_cookies = json.dumps(cookies)
```
The dictionary structure is flat:
- Keys: Cookie names
- Values: Cookie values
Attributes like path, domain, and expiry are not preserved, but since the Cookie header a client sends only carries name=value pairs, this is enough to restore a logged-in session in most cases.
Converting cookiejars to Python dictionaries gives us an easy way to serialize cookies into JSON strings.
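For illustration, here is a minimal sketch that sets a couple of made-up cookies on a Session by hand and shows the flat dictionary that dict_from_cookiejar() produces:

```python
import json

import requests

session = requests.Session()

# Manually set two example cookies (names, values, and domain are made up)
session.cookies.set("session_id", "12345", domain="example.com", path="/")
session.cookies.set("theme", "dark", domain="example.com", path="/")

# Flatten the cookiejar into a plain name -> value dictionary
cookies = requests.utils.dict_from_cookiejar(session.cookies)
print(cookies)              # {'session_id': '12345', 'theme': 'dark'}
print(json.dumps(cookies))  # JSON-serializable as-is
```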
Loading Cookies from JSON with cookiejar_from_dict()
Once we've serialized cookies into JSON, we can later load them back into a CookieJar using the inverse function, cookiejar_from_dict(). It accepts a dictionary in the format generated by dict_from_cookiejar() and returns a CookieJar instance populated with those cookies.
For example, after loading a JSON string created earlier:
```python
import json

import requests

# Load the JSON string saved earlier from file
with open("cookies.json") as f:
    json_cookies = f.read()

cookies_dict = json.loads(json_cookies)
cookiejar = requests.utils.cookiejar_from_dict(cookies_dict)
```
Now cookiejar contains Cookie objects matching the deserialized JSON data. To use these cookies, attach the jar to a Session:
```python
session = requests.Session()
session.cookies = cookiejar
```
Now session will send these cookies with its requests. This allows you to persist cookies across sessions and resume exactly where you left off.
Saving and Loading Cookies to File
A common way to save cookies between executions is to serialize them to a file on disk, typically in JSON format. Here is a full example saving cookies to a JSON file:
```python
import requests
import json

session = requests.Session()

# Log in
session.post(
    "https://website.com/login",
    data={"username": "myuser", "password": "mypass"},
)

# Access some pages
session.get("https://website.com/page1")
session.get("https://website.com/page2")

# Save cookies to file as JSON
cookies = requests.utils.dict_from_cookiejar(session.cookies)
with open("cookies.json", "w") as f:
    json.dump(cookies, f)
```
We perform some requests to log in and access pages, setting up some cookie state. The cookies are extracted and serialized to a JSON file cookies.json. In another script or after restarting, we can load these cookies again:
```python
import requests
import json

with open("cookies.json") as f:
    cookies = json.load(f)

cookiejar = requests.utils.cookiejar_from_dict(cookies)

session = requests.Session()
session.cookies = cookiejar

# Continue on with session exactly where we left off!
```
This loads cookies from the JSON file back into a CookieJar, which we attach to a new Requests Session. Now this new session picks up right where the previous one ended, reusing the authenticated state and any other cookie data.
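To avoid repeating this boilerplate, you could wrap both halves in a small pair of helpers. This is just a sketch – the function names and default path are my own choices, not anything built into Requests:

```python
import json

import requests


def save_cookies(session, path="cookies.json"):
    """Serialize a session's cookies to a JSON file (name/value pairs only)."""
    cookies = requests.utils.dict_from_cookiejar(session.cookies)
    with open(path, "w") as f:
        json.dump(cookies, f)


def load_cookies(session, path="cookies.json"):
    """Load cookies from a JSON file into an existing session's cookiejar."""
    with open(path) as f:
        cookies = json.load(f)
    session.cookies = requests.utils.cookiejar_from_dict(cookies)
```

Usage then becomes a one-liner at the end of one run (save_cookies(session)) and at the start of the next (load_cookies(session)).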
Full Examples Saving and Loading Cookies
Below are some complete examples demonstrating usage of saving and loading cookies with the Requests library in Python.
Log In Once and Reuse Session
This illustrates logging into a site, saving cookies, and loading them in a new script to avoid re-logging in:
```python
# script1.py
import requests
import json

session = requests.Session()

# Log in and access some pages
resp = session.post(
    "https://website.com/login",
    data={"username": "myuser", "password": "mypass"},
)
session.get("https://website.com/page1")

# Save cookies to file
with open("cookies.json", "w") as f:
    json.dump(requests.utils.dict_from_cookiejar(session.cookies), f)
```

```python
# script2.py
import requests
import json

with open("cookies.json") as f:
    cookies = json.loads(f.read())

session = requests.Session()
session.cookies = requests.utils.cookiejar_from_dict(cookies)

# Continue on with session without re-logging in!
session.get("https://website.com/page2")
```
This avoids having to log in each time you run a script. Just save cookies once logged in, and load them to continue an authenticated session later.
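One caveat: saved cookies can expire or be invalidated server-side, so it is worth verifying that the loaded session is still authenticated before relying on it. Here is one possible sketch of that idea, assuming a hypothetical account page that redirects to /login when the session is stale:

```python
import requests
import json

with open("cookies.json") as f:
    cookies = json.load(f)

session = requests.Session()
session.cookies = requests.utils.cookiejar_from_dict(cookies)

# Hypothetical check: request a page that requires login, without following redirects
resp = session.get("https://website.com/account", allow_redirects=False)

if resp.status_code in (301, 302) and "/login" in resp.headers.get("Location", ""):
    # Cookies are stale – log in again and re-save them
    session.post(
        "https://website.com/login",
        data={"username": "myuser", "password": "mypass"},
    )
    with open("cookies.json", "w") as f:
        json.dump(requests.utils.dict_from_cookiejar(session.cookies), f)
```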
Resume Scraping After Interruption
Here we scrape some pages, pause and save cookies, then resume scraping where we left off:
```python
# script1.py
import requests
import json

session = requests.Session()

# Start scraping
for page in range(1, 100):
    data = session.get(f"https://website.com/data?page={page}").json()
    # process data

# Stop and save cookies
with open("cookies.json", "w") as f:
    json.dump(requests.utils.dict_from_cookiejar(session.cookies), f)
```

```python
# script2.py
import requests
import json

with open("cookies.json") as f:
    cookies = json.loads(f.read())

session = requests.Session()
session.cookies = requests.utils.cookiejar_from_dict(cookies)

# Resume scraping where we left off
for page in range(100, 200):
    data = session.get(f"https://website.com/data?page={page}").json()
    # process data
```
This lets you break up long scrapes across multiple executions, saving state so you can pick up where you left off.
Share Cookie Session Between Scripts
Export and import cookie sessions for re-use across different scripts:
```python
# script1.py
import requests
import json

session = requests.Session()

# Do some scraping and processing
# Log in, access data, etc ...

# Export cookie session
cookies = requests.utils.dict_from_cookiejar(session.cookies)
json_cookies = json.dumps(cookies)

with open("shared_cookies.json", "w") as f:
    f.write(json_cookies)
```

```python
# script2.py
import requests
import json

with open("shared_cookies.json") as f:
    cookies = json.loads(f.read())

session = requests.Session()
session.cookies = requests.utils.cookiejar_from_dict(cookies)

# Continue on reusing authenticated session
session.get("https://website.com/api")
```
This allows multiple scripts to share cookie sessions without having to duplicate the same login logic.
Quickly Recreate Development Sessions
During development and testing, reload saved cookie files to recreate sessions quickly:
```python
# normal_flow.py
import requests
import json

session = requests.Session()

# Do all the usual stuff...
# Login, access pages, data, etc...

# Save session for later
with open("dev_cookies.json", "w") as f:
    json.dump(requests.utils.dict_from_cookiejar(session.cookies), f)
```

```python
# test_module.py
import requests
import json

# Load saved session
with open("dev_cookies.json") as f:
    cookies = json.loads(f.read())

session = requests.Session()
session.cookies = requests.utils.cookiejar_from_dict(cookies)

# Test module using realistic session
result = session.get("https://website.com/api")
```
This avoids constantly repeating the same login and setup steps before testing a module. Just save sessions from normal usage then quickly reload them as needed.
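If your tests use pytest, one way to package this up is a fixture that hands each test a pre-authenticated session – a sketch that assumes the dev_cookies.json file saved above already exists:

```python
# conftest.py (sketch)
import json

import pytest
import requests


@pytest.fixture
def authed_session():
    """Return a Session preloaded with the cookies saved from a normal run."""
    with open("dev_cookies.json") as f:
        cookies = json.load(f)

    session = requests.Session()
    session.cookies = requests.utils.cookiejar_from_dict(cookies)
    return session
```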
Conclusion
Here's a quick summary of what we covered:
- Cookies allow HTTP servers to identify clients and store session data. They are sent with requests once set.
- Saving and loading cookies in Python Requests lets you maintain persistent sessions and pause/resume scripts.
- **dict_from_cookiejar()** serializes a cookiejar to a JSON-serializable dictionary.
- **cookiejar_from_dict()** loads a cookie dictionary back into a Requests cookiejar.
- Cookies can be saved to JSON files for storage between executions.
- Full examples were provided for reusing sessions, resuming scraping, sharing cookiejars between scripts, and speeding up development.
Learning to use cookies properly is an important skill for web scraping and automation in Python. Following the examples here will help you persist and reuse cookie sessions with the Requests library.