Sitting Actions
Maulin Hemani, Sangrin Lee and Andrew McConnell
Abstract— There is a significant association between time spent on sedentary activities and the risk of chronic disease. As a population, older adults are more likely to develop such chronic diseases; on a related note, they spend on average over twenty percent more time per day sitting than younger adults. Given their detrimental effects on personal health, sitting activities including watching TV, reading a book, and working at a desk pose important targets for intervention to break up the time spent on sedentary actions. Decreasing time spent sitting in older adults presents an opportunity for a great advancement in personal health.

With the overarching goal of breaking up sitting times, we present in this paper a system to detect specific sitting activities. Using a combination of inertial and sound sensors, we have developed a decision-tree based model which can differentiate sedentary contexts for the purpose of identifying which are the best targets for interventional strategies. This system will be used in real time to understand why people are sitting, whether they could and would be willing to take a break from sitting, and what they would do if they were to break their sitting time. Such an interventional, sensor-based mobile health device could have large ramifications in the healthcare industry.

To this end, our system has been developed for use on the Asus ZenWatch, which runs the Android Wear operating system. We trialled our system on ten people to gather data with which to build the model. By monitoring the onboard sensors, our system has been able to differentiate among standing, sitting, and lying down activities, and further contextualize them into specific categories, with an accuracy of over ninety-seven percent.

I. INTRODUCTION

Our rapidly evolving world presents many changes and challenges to people as they attempt to adjust. These changes are occurring in all aspects of our lives, including our lifestyles and personal health. One noticeable trend is the ever-decreasing need for constant physical activity throughout the day. With new types of technology and jobs, our generation has been able to notably reduce the amount of physical activity in our daily lives. The other trend pertinent to this paper is the major increase in the time people now spend sitting, as more of our daily activities revolve around sitting. A simple illustration of this can be seen by imagining a typical day: driving to destinations, working at a desk, eating, watching TV, and browsing the internet on a laptop all involve sitting down. Whether we realize it or not, sedentary time is taking up a larger role in our lives. Unfortunately, this increasing sitting time carries related health risks and concerns we wish to study further. Papers by Owen and Wennman showed that it may lead to higher chances of chronic diseases such as cardiovascular disease, as well as a higher risk of premature mortality [8][13]. Additionally, the risk primarily increases for older adults, as they spend over 20% more time sitting each day and are more prone to develop chronic diseases.

To combat this, we aim to develop a model to detect both when one is sitting and the specific sitting context, using passive sensor data. By identifying both, we hope to be able to break up this sedentary time based on the context once it is detected that our subject is sitting. Splitting up the sitting time by encouraging simple physical activity as a break will ideally decrease the health risks to which people who sit constantly are exposed.

II. RELATED WORKS

In developing our system, we analyzed previous research that tackled relevant issues. One similar study, which looked to identify sedentary actions related to ours, used a specially designed K-Sense monitoring system based on inertial measurement units (IMUs) consisting of an accelerometer, gyroscope, and magnetometer. The system was attached to the waist, wrist, and ankle. Using signal processing, it also tracked the movements of various body parts by measuring kinetic motions. The result indicates that inertial-based activity systems can be used to identify various nuanced activities accurately. In our study, we used a smart watch which contains a 3-axis accelerometer, gyroscope, gravity sensor, and magnetometer to collect the sensor data. Based on these data, our study aimed to identify people's movement and provide users with more useful information [14].

Another study investigated the use of commercially available smart watches for activity detection, and found them to be more useful than similar smart phones for detection of specialized hand-based activities (the paper specifically notes the advantage in detecting eating different kinds of food or drinking) [12].

Yet another study focused on the association of type-specific and total time spent sitting with the Framingham score. The Framingham score was calculated using subjects' physical information such as age, blood pressure, and cholesterol levels. Sedentary time and context were recorded by having subjects wear Hookie accelerometers at their waist. Sedentary time in a sitting context was recognized from raw acceleration data based on low intensity of movement, using mean amplitude deviation and device orientation relative to an identified upright position defined at the end of each epoch. Based on this information, the subjects' sitting contexts were identified. Statistical analysis showed that only sitting while watching television is related to the Framingham score. This finding led our study to identify
more specific relations between sitting activities and sitting time [13].

Another study indicated that, by using a cluster heat map, a graphical representation of accelerometer data for each subject, it is possible to assess subjects' sedentary time and activities. Based on the accelerometer counts, minutes since walking and activity intensity (counts/min) were represented as colors in a two-dimensional map. This map identified whether the subject was walking or sitting. In this work, sitting time was defined as fewer than 100 accelerometer counts per minute, and was related to detrimental health status. The paper claims that the finding also covers activity transitions from sedentary to non-sedentary status, and vice versa [8].

One more study used Ecological Momentary Intervention (EMI), which provides a framework for treatment and detects eating habits, weight change, and other physical activity. Using a mobile phone, people can provide information and receive real-time assessment based on EMI delivery, frequency, and duration. Compared to EMI, our study focuses more on a sensor-based device, which has an advantage in detecting exact movement. Therefore, by analyzing this information, we believe we can make our system more valid and provide people with more exact information [3].

In contrast with past studies, our system aims to use a combination of smart watch-based sensors only, in order to minimize the overall impact on one's normal life when using the system. By containing all the necessary hardware in a commercial watch, we encourage users to adhere to using the system. To combat the loss of sensor information from using only one device, we also implemented software which allows the watch's built-in microphone to act as a sensor for sound amplitude, adding another dimension to the recorded data for an activity sample. We believe this additional information will help differentiate the more nuanced activities by helping to identify the user's environment (i.e. whether the television is on). Also, rather than calculating features in the frequency domain, we analyzed the instantaneous rate of change between two consecutive data samples to provide additional context for the model. We believe that our system's configuration will prove to be a useful advancement in the field of Activity Recognition (AR).

III. SYSTEM

Our primary source of data acquisition was the Asus ZenWatch (WI500Q). This specific device was chosen for the variety of sensors available in its hardware, including a three-axis accelerometer, a three-axis gyroscope, a three-axis magnetometer, and a heart rate monitor, as well as a built-in microphone, which we have developed code to convert into a sound amplitude sensor. The heart rate monitor was not used, as it requires the user to place two fingers on the watch screen (which reduces the passive nature of detection) and has been reported to have below-average accuracy [9].

Further advantages of the ZenWatch come from its use of Invensense sensors, which include virtual sensors that manipulate raw data to create more contextual information, such as orientation and linear acceleration (see Table I). This essentially provides some automatic feature generation capability [6].

Additionally, the watch has Bluetooth capabilities, which allow pairing with a nearby Android device. This was initially utilized for long-term data storage on a remote server, but this feature was later removed due to data loss and slow transmission of large data samples.

For testing, we developed an Android Wear app that allows sensor data to be collected and labeled with the correct activity being performed; these labels were used for initial model validation.

TABLE I
SYSTEM SENSORS [7]

Sensor               | Description                                                                 | Type
Accelerometer        | Acceleration of the device along the 3 sensor axes                          | Raw
Gravity              | Direction and magnitude of gravity in the device's coordinates              | Fusion
Gyroscope            | Rate of rotation of the device around the 3 sensor axes                     | Raw
Magnetometer         | Magnitude and direction of Earth's magnetic flux                            | Raw
Linear Accelerometer | Linear acceleration of the device in the sensor frame, gravity vector subtracted | Feature
Rotation Vector      | Orientation of the device relative to the East-North-Up coordinate frame    | Fusion
Step Detector        | Generates an event each time a step is taken by the user                    | Feature
Microphone           | Sound amplitude                                                             | Feature
Orientation          | Pitch, roll and yaw based on inertial sensors                               | Fusion

A. Back End System

Our back end system is primarily comprised of a foreground job and a background job. When the app starts in the foreground, it first checks local permissions (read/write to external storage, record audio, modify audio settings, etc.). If these are not granted, the app requests all of them in order to run. Permissions are declared in the Android manifest, but if they are not granted automatically due to the Android API level, we developed code to request permissions when the app starts. At the same time, all the sensor objects (Accelerometer, Gyroscope, Gravity, etc.) and the audio system object are initialized. Once a user selects an activity, all sensors are registered and the audio object starts to calculate sound amplitude data. When this begins, the sensors record samples at a rate of 5 Hz (chosen with consideration of the cost of the sound amplitude calculation), and the data are stored in a built-in SQLite database on the local SD card for the duration of testing. When the stop button is pressed, each sensor is unregistered and sensor events stop. After the user completes all the activities and presses the replicate button, the app converts all the entries in the SQLite database into CSV files stored on the watch; once this is done, all the entries in the database are deleted. These converting and deleting processes run in the background, so the user can continue to perform other activities while entries are converted into CSV files.
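The conversion step just described can be sketched as follows. This is a minimal Python illustration of exporting a SQLite table of sensor samples to CSV and then clearing the table, not the app's actual Android implementation; the table name `samples` and its column layout are hypothetical, since the paper does not specify the schema.

```python
import csv
import sqlite3

def export_samples_to_csv(db_path, csv_path):
    """Dump all rows of the (hypothetical) 'samples' table to a CSV file,
    then delete them, mirroring the app's background convert-and-delete job."""
    conn = sqlite3.connect(db_path)
    try:
        cur = conn.execute(
            "SELECT timestamp, user_id, label, accel_x, accel_y, accel_z "
            "FROM samples ORDER BY timestamp")
        with open(csv_path, "w", newline="") as f:
            writer = csv.writer(f)
            # First row: column names taken from the cursor description.
            writer.writerow([c[0] for c in cur.description])
            writer.writerows(cur)
        # Mirror the app's behaviour: delete entries once exported.
        conn.execute("DELETE FROM samples")
        conn.commit()
    finally:
        conn.close()
```

On the watch itself this work runs as a background job so the UI stays responsive; the sketch above collapses that into a single synchronous call for clarity.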
B. System Communication

All the interactive items in the front end are saved in resources with their IDs, and each method in the back end can find an interactive item by its ID. Once they are connected, each back end method can change an item's properties at any time. The initial design of the app had the SQLite data sent to a remote server via a Bluetooth pairing and internet connection, but this was problematic due to the slow transmission rate, so we redesigned it to store the entire test data set on the watch until it could be pulled later. This eliminates the need for a local WiFi connection, or even a Bluetooth pairing with a smart phone; this quality is useful for the overall study, as further iterations of the app can now be set up to recognize activity on the fly, without dependence on other devices. Instead, the watch is later connected to a computer via USB and the CSV files are extracted from the ZenWatch's internal storage through the Android Debug Bridge (adb) shell environment. Later iterations of this app could even remove this component, so that the data can be processed in real time on the watch, removing the need to transfer data to any other device.

Fig. 1. ZenWatch App Interface

C. App Design

The user interface is primarily comprised of four textview widgets to display status and sixteen button widgets to perform actions. The four textviews display the user's unique identification and which context the user is performing. The buttons provide options to change the user ID, stop the context, write all the database entries into CSV files, display the total number of entries in the local SQLite database, and select the context which the user will perform (to establish a ground truth).

IV. METHODS

As the ZenWatch collected test data, the data were stored locally on the watch for the duration, until a later time when they could be processed. Each data sample was stored in the local SQLite database until replication, when the database is converted into CSV files. On transfer to a computer, each sample was split and placed into one of two CSV files: one file containing the output sensor data (with sensors in columns, and each row corresponding to a single point in time), and one file containing the user ID, the label, and the time point for the corresponding sensor data.

Following data collection, the data were preprocessed through PASDAC prior to classification and evaluation. Specifics are described below.

A. Preprocessing

Data were segmented by the sliding window method into windows of 2 second length with 1 second overlap, as previous studies have shown this to be a good window length on average [4]. The data were also smoothed, and then feature extraction commenced.

The focus of our feature extraction was the time domain. This was with the intention of maximizing performance speed, so that in the future the model may be applied to real-time scenarios. Because we believe the sensor data will not be particularly repetitive when compared to less sedentary activities such as walking or running, we do not believe there would have been much net benefit outside of the time domain.

We calculated a variety of features based on the sensor data in order to assist in specific activity recognition. As one of our detection focuses, statistical metrics were collected across the sensors to assist in determining subject position (sitting, standing, and lying down). These features include mean and variance. The goal of this portion is to see if the user is sitting, before more complex sitting activities are determined. Step detection turned out to also be a useful metric here: if a step was detected, the user was not likely to be sitting (depending on the reliability of the sensor).

Envelope metrics were calculated as well, particularly for sensor data. Such features include max, min, and range. We believed these would be especially useful with respect to sound amplitude for determining the environment. This was expected to particularly factor into detecting activities such as talking, phone use, and watching television, as well as differentiating between the silent and talking activities (i.e. standing versus standing while talking). A zero-crossings feature was also used for similar purposes, calculated per window by

    ZC_w = Σ_{i=1}^{n−1} f(S_i, S_{i+1})

where S_i is the data point at time i, and f is a function such that

    f(x, y) = 1, if x·y ≤ 0
    f(x, y) = 0, if x·y > 0

A mean-crossings feature was also calculated for each data column, counting the number of times the signal crosses the mean rather than the zero axis.

Values were also calculated which measure the instantaneous change in sensor data. Each point was calculated by

    ΔV_i = (V_i − V_{i−1}) / 2

Each of these rates also has the above statistical features calculated.

B. Classification

We used machine learning algorithms to train our activity classifier, utilizing 10-fold cross-validation. Though we evaluated the results of a few algorithms, including Naïve Bayes, C4.5 Decision Tree, K-Nearest Neighbors, and Random Forest, we present our model based on a C4.5 Decision Tree, similar to previous work [14], due to its easy interpretability. A small portion of our tree can be seen in Figure 2.

Fig. 2. A visualization of a part of our decision tree [10]

C. Evaluation

We evaluated our system based on the choice of features (based on information gain), the choice of machine learning algorithm (based on accuracy and/or F-measure), and most importantly, the effectiveness of our system. The final evaluation was based on a confusion matrix, total accuracy, true positive rate, false positive rate, and F-measure. The results of this evaluation setup are presented in Section VI, and the evaluation formulas are summarized in Appendix 1.

V. EXPERIMENTAL SETUP

The initial round of user testing, which was conducted to build our model, consisted of ten subjects, with testing done in their home environments. The decision to test in users' homes was made to ensure that the resulting data represent the most realistic actions the user would make, given their personal situations.

Fig. 3. Example setup of User Testing, showing the Writing/Desk Work (sitting) activity

Each user wore one Asus ZenWatch 1 on their left wrist. The left wrist was chosen based on related work [2], which found the non-dominant wrist to be better suited for AR tasks. For consistency, the left wrist was chosen for everyone, as the right hand is more commonly the dominant one. In their home environments, the users performed each of the sitting activities we aimed to differentiate for five minutes each. They also performed comparative standing activities for differentiation, generating a total of one hour of data per test subject. The resulting datasets were used to build the model and for initial validation.

A second round of user testing may be conducted for further model validation. This would be done by having test subjects wear the ZenWatch outside of the lab as they perform their daily routine activities. During this period, whenever the AR model detects that the user is both sitting and has begun one of our target activities, the system will send a message to the user asking for confirmation of the detected activity. At the same time, it may also ask if the user would be willing to take a break from sitting during this activity. It will not send the follow-up question every time, so that the user does not become disinclined to continue using the system. The reasoning behind this secondary validation testing is to show the AR model's usefulness in the field.

VI. RESULTS

Based on the initial round of user testing, our results show the following with regard to the evaluation criteria outlined in Section IV-C. Our most useful features, as shown in Table II, were found to focus heavily on those that look at change in sensor data over time. These were calculated in Weka, a machine learning workbench for knowledge analysis, using 10-fold cross-validation, with the reported values
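The time-domain features described above (sliding windows of 2 seconds with 1 second overlap at 5 Hz, mean and variance, envelope metrics, zero- and mean-crossings, and the half-difference rate of change) can be sketched as follows. This is an illustrative Python re-implementation under our reading of the formulas, not the PASDAC code; all function and constant names are our own.

```python
from statistics import mean, pvariance

RATE_HZ = 5          # sampling rate used by the watch app
WIN = 2 * RATE_HZ    # 2-second windows -> 10 samples
STEP = 1 * RATE_HZ   # 1-second overlap -> 5-sample hop

def windows(signal):
    """Segment a 1-D signal into overlapping sliding windows."""
    return [signal[i:i + WIN] for i in range(0, len(signal) - WIN + 1, STEP)]

def crossings(w, level=0.0):
    """Count crossings of the given level: a consecutive pair counts when
    (a - level) * (b - level) <= 0, matching the f(x, y) definition above."""
    return sum(1 for a, b in zip(w, w[1:]) if (a - level) * (b - level) <= 0)

def rate_of_change(w):
    """Instantaneous change between consecutive samples, (V_i - V_{i-1}) / 2."""
    return [(b - a) / 2 for a, b in zip(w, w[1:])]

def features(w):
    """Per-window time-domain feature vector."""
    return {
        "mean": mean(w),
        "variance": pvariance(w),
        "max": max(w), "min": min(w), "range": max(w) - min(w),
        "zero_crossings": crossings(w, 0.0),
        "mean_crossings": crossings(w, mean(w)),
        "rate_mean": mean(rate_of_change(w)),
    }
```

In the full pipeline the same statistics would also be computed over the rate-of-change series of each sensor column, as the text notes.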
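The 10-fold validation used here can be sketched as a fold-splitting routine. The actual experiments used Weka's built-in cross-validation; the pure-Python fragment below only illustrates how sample indices are partitioned into ten folds, with the classifier itself omitted.

```python
def k_fold_indices(n_samples, k=10):
    """Partition indices 0..n_samples-1 into k folds and yield
    (train_indices, test_indices) pairs, one per fold."""
    indices = list(range(n_samples))
    # Distribute any remainder so fold sizes differ by at most one.
    fold_sizes = [n_samples // k + (1 if i < n_samples % k else 0)
                  for i in range(k)]
    folds, start = [], 0
    for size in fold_sizes:
        folds.append(indices[start:start + size])
        start += size
    for i in range(k):
        test = folds[i]
        train = [idx for j, fold in enumerate(folds) if j != i for idx in fold]
        yield train, test
```

Each of the k rounds trains on k−1 folds and tests on the held-out fold, so every window is used for testing exactly once.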
TABLE II
C ONFUSION M ATRIX
Activity 1 2 3 4 5 6 7 8 9 10 11 12
Television 1 970 2 1 0 11 5 0 0 1 0 0 0
Computer Use 2 0 944 6 4 3 3 0 0 0 4 1 0
Reading 3 2 0 692 0 3 2 0 1 0 0 1 0
Writing/Desk 4 2 2 0 628 1 3 0 1 2 1 0 1
Phone Use 5 13 4 2 0 973 4 7 0 0 0 2 0
Talking 6 3 0 0 0 4 989 5 1 0 2 2 0
Other 7 1 0 0 0 2 6 929 0 0 2 0 0
Standing 8 1 0 0 2 0 1 1 921 2 0 6 2
Walking 9 0 0 0 2 0 2 0 2 922 0 2 28
Lying Down 10 1 1 0 0 1 5 1 1 2 742 0 0
Standing (Talking) 11 0 0 1 1 1 3 0 6 0 0 1020 0
Walking (Talking) 12 0 0 0 1 0 0 0 1 41 0 1 988
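The metrics listed in Section IV-C (total accuracy, true positive rate, false positive rate, F-measure) can be derived directly from a confusion matrix like Table II. The sketch below uses a small hypothetical 3-class matrix for illustration, not the study's actual data, and assumes the usual convention of rows as actual class and columns as predicted class.

```python
def per_class_metrics(cm, cls):
    """TPR, FPR and F-measure for one class of a square confusion
    matrix (rows = actual class, columns = predicted class)."""
    n = len(cm)
    tp = cm[cls][cls]
    fn = sum(cm[cls][j] for j in range(n)) - tp   # rest of the row
    fp = sum(cm[i][cls] for i in range(n)) - tp   # rest of the column
    tn = sum(sum(row) for row in cm) - tp - fn - fp
    tpr = tp / (tp + fn) if tp + fn else 0.0      # recall / sensitivity
    fpr = fp / (fp + tn) if fp + tn else 0.0
    precision = tp / (tp + fp) if tp + fp else 0.0
    f_measure = (2 * precision * tpr / (precision + tpr)
                 if precision + tpr else 0.0)
    return tpr, fpr, f_measure

def total_accuracy(cm):
    """Fraction of samples on the main diagonal."""
    correct = sum(cm[i][i] for i in range(len(cm)))
    return correct / sum(sum(row) for row in cm)
```

Applying these functions to each of the twelve activity classes in Table II would reproduce the per-class figures reported by Weka.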
Fig. 4. Example setup of User Testing, showing the Lying Down comparison activity

TABLE IV
CLASSIFIER ACCURACY