Sleep Trackers in 2017

Data Extraction Guide

This guide is an informal summary of how I retrieved data from the various devices used in my sleep tracking study. If you have any questions or suggestions, feel free to email me! I would also love to hear if any of the information below proves useful for you in future projects.


Fitbit

Fitbit provides an API for developers. You can find documentation for Get Sleep Logs here, which includes minute-by-minute data. I personally had trouble with authentication when trying to use this API, so I used a graphical API explorer on Apigee instead to get the sleep details for each night. I believe you can do something similar using Postman as well. Here’s what I did on Apigee:

  1. Click on the Authentication drop-down at the top and select OAuth 2. Log in with your Fitbit account and authorize access.
  2. Go to the SLEEP section of requests on the left, and then select “Get Sleep”.
  3. Navigate to the Template* tab and modify the date parameter of your request to the night you’re looking for.
  4. Send your request, and it will return minute summaries for that specific night.
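If you’d rather script this than click through Apigee, here’s a minimal Python sketch of the same request, assuming you’ve already obtained an OAuth 2 access token (the token value is a placeholder, and the v1 endpoint path is my reading of the Get Sleep Logs docs, so double-check it against the current documentation):

```python
import json
import urllib.request

ACCESS_TOKEN = "your-oauth2-token-here"  # placeholder: paste the token you get after authorizing

def sleep_log_url(date):
    # date is a 'YYYY-MM-DD' string; '-' means "the currently authorized user"
    return "https://api.fitbit.com/1/user/-/sleep/date/" + date + ".json"

def get_sleep_log(date):
    # GET the sleep log for one night, authenticating with the Bearer token
    req = urllib.request.Request(
        sleep_log_url(date),
        headers={"Authorization": "Bearer " + ACCESS_TOKEN})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))
```

The minute summaries are inside the returned JSON, so once this call succeeds you can iterate over the per-minute entries for that night.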


Jawbone

Jawbone provides an API for developers (sleep endpoints are here -- they sometimes call the stages “ticks” or “phases”), but I wasn’t sure how to get started with authentication here either. I used this tutorial that walks you through using the NPM version of the Jawbone API with Node.js as my main frame of reference, and tweaked the code a little bit to get what I wanted.

** According to the Jawbone API dev docs, the "depth" output values correspond as follows: “1=awake, 2=light, 3=deep”. The sleep phases, or “ticks”, come as chunks of time categorized by these stages, rather than minute-by-minute entries.
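For reference, here’s a tiny sketch of how those depth codes could be mapped to readable labels. The “time”/“depth” field names are my assumption about what each tick object contains, so adjust them to whatever the API actually returns:

```python
# Depth codes from the Jawbone dev docs: 1=awake, 2=light, 3=deep
DEPTH_LABELS = {1: "awake", 2: "light", 3: "deep"}

def label_ticks(ticks):
    """Convert a list of sleep 'ticks' into (start_time, stage_label) pairs.

    Assumes each tick is a dict with 'time' (epoch seconds, start of the
    chunk) and 'depth' (1-3) -- field names are hypothetical.
    """
    return [(t["time"], DEPTH_LABELS[t["depth"]]) for t in ticks]
```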

Microsoft Band

We found that the MS Health Dashboard website displays sleep charts generated with HTML/CSS. Thanks to Jing Qian, we were able to find the request URL that retrieves the sleep data for a specific night by following the steps below.

  1. Log in to your Microsoft Health Dashboard at
  2. Input this URL in your browser, adjusting the date parameter to the night of your choice: (You may need to change utcOffsetMinutes based on your timezone.)

The output text on this page contains a ton of data, including the sleep stage of each chunk of time in the “SleepType” and “SequenceType” fields.
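Since the structure of that response isn’t documented, a generic recursive search is a safe way to pull out every “SleepType” (or “SequenceType”) value without assuming the exact JSON layout. A sketch:

```python
def find_key(obj, key):
    """Recursively collect every value stored under `key` anywhere in a
    nested JSON-like structure of dicts and lists."""
    found = []
    if isinstance(obj, dict):
        for k, v in obj.items():
            if k == key:
                found.append(v)
            found.extend(find_key(v, key))
    elif isinstance(obj, list):
        for item in obj:
            found.extend(find_key(item, key))
    return found
```

Feed it the parsed JSON from the dashboard request, e.g. `find_key(data, "SleepType")`, and you get a flat list of every stage value in the response.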

We found another data export tool for Microsoft Band that works for other types of minute summaries (including heart rate, temperature, etc.), but it does NOT include sleep stage data. If you’d like to check it out: here’s the Reddit post we found it on, and here’s the link to the GitHub repo.

Hello Sense

Hello Sense’s support team told me that their data export feature is not ready for public use yet, but Jeff Huang was able to figure out how to get the data using Postman. Thanks Jeff!

  1. Download the Mac app called Postman for sending HTTP POST requests. It’s free!
  2. Then at the top, where you enter a URL, change the method to POST and put in:
  3. Click on Body and select “x-www-form-urlencoded”, since it’s an OAuth2 request.
  4. Send it, and you should get a Bearer token back. Put that in the header: click on Headers, enter “Authorization” as the key and “Bearer 2.98877beb90394f2fb23db6974e68a6bf” (replace my token with yours) as the value. Then you should be able to issue more interesting commands! For example, to get the sleep timeline data for December 12, 2016, it’s a GET request for
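The Postman steps above can be sketched in Python. The two URLs below are deliberate placeholders standing in for the token endpoint from step 2 and the timeline endpoint from step 4, and the form field names depend on what the OAuth2 request expects, so treat this as a template rather than working code:

```python
import json
import urllib.parse
import urllib.request

TOKEN_URL = "https://example.invalid/oauth2/token"     # placeholder for the URL from step 2
TIMELINE_URL = "https://example.invalid/v1/timeline/"  # placeholder for the URL from step 4

def auth_header(token):
    # Build the Authorization header exactly as in step 4 of the Postman flow
    return {"Authorization": "Bearer " + token}

def get_bearer_token(form_fields):
    """POST the x-www-form-urlencoded OAuth2 fields and return the access token."""
    data = urllib.parse.urlencode(form_fields).encode("ascii")
    with urllib.request.urlopen(urllib.request.Request(TOKEN_URL, data=data)) as resp:
        return json.loads(resp.read().decode("utf-8"))["access_token"]

def get_timeline(token, date):
    """GET the sleep timeline for a 'YYYY-MM-DD' date using the Bearer token."""
    req = urllib.request.Request(TIMELINE_URL + date, headers=auth_header(token))
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))
```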

While this does work, the data retrieved from this request is not quite polished. Many intervals have 1- to 5-minute gaps between them, and the data is not consistent with the graphs shown in their mobile app. I’m guessing this is why they haven’t released the feature to the public yet.
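To quantify how gappy the data is, something like this works once you’ve reduced each interval to a (start, end) pair of epoch seconds (the tuple representation is my assumption, not the API’s format):

```python
def find_gaps(intervals):
    """Given (start, end) epoch-second pairs sorted by start time, return
    the gap in minutes between each interval and the next one."""
    gaps = []
    for (start1, end1), (start2, end2) in zip(intervals, intervals[1:]):
        gaps.append((start2 - end1) / 60.0)
    return gaps
```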

For our study, I was unable to reproduce the data displayed in the Hello Sense app. The app displays visualizations with intervals that seem quite random, with timelines visually stretched without clear patterns. For our graph, I took the data from Postman and manually edited it to best fit what appears in the Sense app.

AMI MotionLogger

The software for the AMI MotionLogger is only available if you own an actigraph or work with a lab that owns these research-standard actigraphs. There are many options for data extraction and formatting featured in the user manual PDF, but to retrieve the data that we used (minute-by-minute sleep vs. awake), follow the steps below.

  1. Once you’ve extracted the data from the actigraph to .ami format at your lab, open the .ami file in ActionW 2.7. You do not need to change the Channel settings.
  2. If you would like to choose a different sleep scoring algorithm, you may choose it in Options -> Sleep. For this project, we went by the default Cole-Kripke algorithm.
  3. Go to Options -> Epoch-by-Epoch and change the settings: Check Date, Time, Sleep Score; Set delimiters to Commas (or whatever you would like); Click OK
  4. Go to File -> Save As… and change File type to Epoch By Epoch (.ebe) file.
  5. The output .ebe file should look like this in a text editor (0 corresponds to Awake, 1 corresponds to Asleep): Date,Time,Sleep; 12/05/16,13:37:00,0; 12/05/16,13:38:00,0...
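The .ebe output is just comma-delimited text, so parsing it back into minute-by-minute records is straightforward. A small sketch, assuming the Date, Time, Sleep Score columns and comma delimiters configured in step 3:

```python
import csv
import io

def parse_ebe(text):
    """Parse the comma-delimited epoch-by-epoch output into
    (date, time, asleep) tuples, where asleep is True when the score
    is 1 (Asleep) and False when it is 0 (Awake)."""
    reader = csv.reader(io.StringIO(text))
    next(reader)  # skip the Date,Time,Sleep header row
    return [(date, time, score == "1") for date, time, score in reader]
```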

Sleep Cycle

After a lot of back-and-forth emailing and miscommunication from the Sleep Cycle team (“Northcube AB”), I found that you can download minute summaries only if you have Sleep Cycle Premium, via your account on their web app here. It looks like this:

I asked them what each of these values in the JSON meant, and their reply was: “The events numbers are (in order): [number of second since night started], [event type], [intensity]. Every event type that is 1 or 3 are movements. All other are for internal processes.”

The first values are not always at regular intervals, nearly all the second values are 3, and the third values are rather unclear. I cannot tell what the thresholds for “Deep sleep”, “Sleep”, and “Awake” are. My best guesses were at about 0 < for Deep sleep, x<y< for Sleep, and > for awake. Northcube AB did not reply to my inquiries about how to interpret these values, beyond claiming to use machine learning algorithms.
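Going by Northcube’s description, at least extracting the movement events is unambiguous. A sketch, assuming each event arrives as a [seconds_since_start, event_type, intensity] triple as their reply describes:

```python
def movements(events):
    """Filter Sleep Cycle event triples [seconds_since_night_start,
    event_type, intensity], keeping only types 1 and 3, which are
    movements according to Northcube's reply; everything else is for
    their internal processes."""
    return [(sec, intensity) for sec, etype, intensity in events
            if etype in (1, 3)]
```

How to threshold those movement intensities into “Deep sleep” / “Sleep” / “Awake” remains the unanswered question above.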

For my visualization, I had to use my best estimates from this data, and then adjust the values manually to match the given images from the app. These are still very rough approximations at best since it is unclear even visually where to mark divisions between stages, as you can see below.

Sleep Coacher