
Security Practice: Design, Adoption, and Use of Technology for

Collaboratively Managing Sensitive Personal Information


Laurian Vega
Department of Computer Science, Virginia Tech
Laurian@vt.edu

ABSTRACT
There is a need in socio-technical systems research, and more specifically in medical informatics, to study how trust and privacy affect the ad hoc negotiation of security rules and how those rules are managed in practice. For my interdisciplinary dissertation I propose to study how people collaboratively work to manage private information in the domains of childcare centers and physicians’ offices. These are places where private information is collected for multiple purposes (e.g., tracking care, receiving payments); however, little work has examined how these needs are reflected or balanced in day-to-day acts of privacy.

Building on the preliminary observations and interviews, my approach to this topic is the use of mirrored ethnographies in each setting. The goal is to collect data on observed breakdowns in the policies of these groups. The data will be analyzed using a phenomenological approach to understand the experience of security breakdowns: what the cause was, how breakdowns were responded to (e.g., bending policies, creating new policies), who was involved, and how privacy and security were maintained, or not maintained. Breakdowns may be as small as a file missing a piece of information, or as large as a missing child, as observed in preliminary studies. The outcomes of this work will be thick descriptions and categorizations of security system breakdowns, along with near-future scenarios depicting aspects of future innovations. A goal of this work is to build on the literature in medical informatics and usable security in order to design prototypes for childcares and physicians’ offices.

The preliminary analysis has produced three loci where implicit and explicit norms influence security and privacy: (1) human-mediated access management explores how the physical placement of filing cabinets and computers within each center was mediated by one person, thereby negotiating access; (2) information duplication explores the role and use of having information in multiple forms and across distributed locations; and (3) community of trust explores how personnel coordinate and co-construct knowledge of the context to assess nuances in policies. These findings represent novel, practical evaluations of security in context, examining factors related to security and privacy that affect technological adoption.

INTRODUCTION
Health records intrinsically contain sensitive personal information. With the growing use of electronic health records and other electronic communication methods, there are parallel growing patient concerns regarding the security and privacy of their sensitive personal information that extend beyond HIPAA. In 2009, thirty-nine percent of physicians were communicating with patients over email, secure messaging services, or instant messaging, and ninety-nine percent of doctors were using the internet [2]. What is not being considered with this increase in technical appropriation, though, are patients’ concerns regarding their privacy. According to the California HealthCare Foundation, one in fourteen consumers uses a personal health record. However, privacy has remained an enduring concern, staying relatively high since a 2005 survey on the adoption and use of these records. Seventy-five percent of participants who said they did not want to use electronic health records reported that it was because of privacy concerns, particularly their sensitive personal information being stored online [1]. This need is echoed in Kaelber et al.’s 2008 review of 100 papers on Personal Health Records (PHRs): they found that privacy and security is one of seven factors that not only affects adoption but is also an emerging and important research area [12]. All of this work points to a need to understand the security and privacy of health records.

In parallel to the work in health informatics is a growing body of work at the conjunction of Human-Computer Interaction (HCI) and security, appropriately termed Usable Security. Usable security is a movement within the security sub-domain of computer science that recognizes that security has two parts: computers and humans. Usable security’s proposal is that if security is hard to manage, it is not because the users are at fault; it is because the security mechanisms are incongruent with the user’s primary task [4]. Understanding the user’s task involves understanding the user’s needs, practices, and values. Additionally, as computing systems become increasingly ubiquitous, they will become more involved in and integrated into everyday work; computers are no longer solitary machines under the computer desk. They are the tablets that physicians carry, or the Bluetooth-enabled glucose meters that patients with diabetes use. Making these systems work in a way that respects the users’ needs but also respects the clients’ privacy in a secure way is a primary goal of usable security.

Unfortunately, much of the work in the realm of security focuses on the creation of rules or policies. The problem with security policies is that they are often secure only in principle; they are seldom secure in practice [7]. Practice is what happens in the moment; it is the activity; it is what is actually done. In this space there is a tension between work practice and security. A plethora of research has demonstrated that when security policies or mechanisms are not appropriately designed to support work practice, security breaks down (e.g., through work-arounds such as writing passwords on post-it notes or, as was observed in the pilot studies, shouting passwords) [4, 9]. When a breakdown occurs in a social system, though, workers do not stop doing work. They create special cases or methods that allow them to continue: bends in the formal policies. In this sense, social systems are intrinsically flexible. When we start to think about electronic systems, the reverse is true: electronic systems work according to pre-encoded, deterministic rules.
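To make the contrast concrete, consider a minimal sketch of such a pre-encoded rule. The roles and the function name here are hypothetical illustrations, not drawn from any system studied in this work; the point is only that an encoded rule can match or fail to match, with no room for the situational judgment a human gatekeeper would apply.

```python
# Hypothetical illustration of a pre-encoded, deterministic access rule.
# A human gatekeeper could weigh context ("the substitute is here with the
# director"); the encoded rule can only match or fail to match.

ALLOWED_ROLES = {"director", "assistant_director"}  # fixed at design time

def can_open_child_file(role: str) -> bool:
    """Return True only if the role appears in the pre-encoded list."""
    return role in ALLOWED_ROLES

print(can_open_child_file("director"))    # True
print(can_open_child_file("substitute"))  # False: no mechanism to bend the rule
```

A social system would handle the second case by negotiating an exception; the encoded rule simply denies it.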

My dissertation merges the theoretical frameworks of HCI, the standpoint of usable security, and the domains of healthcare and childcare. The domains of healthcare and childcare are two instantiations of a similar socio-technical system. The type of socio-technical system I have been and will be studying is one in which groups coordinate and manage their clients’ private information through physical and technological mechanisms. In childcare centers, workers care for the enrolled children and also manage each child’s personal information. In medical centers, workers manage patients’ health along with their health information. The use of two areas allows for generalization across similar work environments while also exploring different dimensions (e.g., routines, legislation).

For my study it was critical to select areas where privacy is managed in practice. The settings of childcares and medical practices were selected because they are similar in their goal of managing and respecting sensitive personal information. Childcares were selected because they are a setting where access is more easily granted, regulations are less pervasive, and sensitive information is ubiquitous in the caretaking job. Privacy concerns there may not be as prevalent as in physicians’ offices, given that the information being stored is not life-critical. Medical practices were selected because of the daily use of information, the highly regulated nature of its use, and the life-critical importance of having correct information. By studying both areas my goal is to better understand the management of private information. Areas that were considered but not used were employee files, criminal files, student files, and client files.

While it is not surprising that there is a difference between expected by-the-book processes and actual practice, this issue is of particular importance with respect to security. ‘Security’ as a theoretical construct and as a focus of technological development is an area where failure carries high risk. Ensuring that systems, both social and technical, are correct and dependable is an increasingly difficult problem given the growing reliance on rigid security systems [10]. Additionally, studying these systems is becoming ever more necessary as we strive to design secure and usable systems for childcares and medical practices, which are increasingly adopting digital documentation systems [11]. It is for this reason that there exists a need to study and understand how socio-technical systems manage security practice.
Responding to the need for research in this area, my research question is: how do socio-technical systems
that use sensitive personal information manage work-practice breakdowns surrounding the implicit and
explicit rules of process? I have further broken this down into three sub-questions:
· What are the implicit and explicit rules surrounding how medical practices and childcares handle
sensitive personal information?
· What breakdowns happen when the explicit and implicit rules are not followed?
· How are breakdowns accounted for, negotiated, and managed in socio-technical systems where
sensitive personal information exists?
Through understanding the breakdowns that are encountered in the use of both social and technical
systems, new systems can be designed that are flexible, have socially negotiated policies, and result in more
reliable and secure day-to-day practice. Specifically, by understanding what causes security work-arounds and how users respond to them, technological design can start to create systems that result in less end-user frustration and fewer discrepancies between what users need for their practice and what is necessary for
security and privacy.

METHODOLOGY
With recently granted approval from the Virginia Tech Institutional Review Board, active-participant observations are being employed. Data will be gathered by observing work at both childcare and medical centers. The pilot data and the data from these observations will form the basis for the overall findings of this study. Daily observation logs will be kept, along with pictures of representative artifacts. Audio-recorded interviews will be transcribed verbatim. Breakdowns will then be analyzed using a phenomenological approach to produce an emergent understanding of how the socio-technical system employs security in practice [16]. Observations and interviews with key stakeholders will be used to triangulate and discern practices.

Qualitative methods, such as interviews and observations, are critical investigation mechanisms because they account for the reasons and motivations that may go unreported on surveys. These methods will allow me to see when and where technology fails to maintain security and to propose more ecologically valid solutions to address these breakdowns. The second reason for using qualitative studies in this space is the lack of knowledge about actual security and privacy practice. Prior work has asked what happens when security policies are inadequate for the task; little work has looked at what actually happens. Qualitative methods that examine the specific in order to abstract to the general are appropriate for the research goal of understanding security breakdowns.

Given the intrinsically sensitive nature of the data I am collecting, strict privacy protocols have been and will be adhered to. There are national and state regulations protecting child and patient information (e.g., HIPAA, FERPA). For my study, identifiers of the participants are stripped from all documentation except for the informed consent and a document listing identifiers, names, and contact information. This includes secondary participants such as patients, children, and caregivers. All data is stored on password-protected computers, on external hard drives locked in cabinets, and on data print-outs that have had identifying information removed but are still locked in cabinets. All original documentation apart from the informed consents has been shredded. It is not my intention to collect information about particular children, caregivers, or patients. However, when names are encountered, they are given unique identifiers in the data to protect identity and anonymity.
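The identifier-replacement step described above can be sketched in code. This is an illustrative outline only, under assumed conventions (the class name, and identifiers of the form “P1”, “P2”, …); it is not the actual study instrument.

```python
# Illustrative sketch of the de-identification step: each name encountered
# in a transcript is replaced by a stable unique identifier, while the
# name-to-identifier mapping is kept as a separate, locked-away document.

import itertools

class Pseudonymizer:
    def __init__(self) -> None:
        self._counter = itertools.count(1)
        self._mapping: dict[str, str] = {}  # stored apart from the data itself

    def pseudonym(self, name: str) -> str:
        """Return a stable identifier such as 'P3' for a given name."""
        if name not in self._mapping:
            self._mapping[name] = f"P{next(self._counter)}"
        return self._mapping[name]

p = Pseudonymizer()
print(p.pseudonym("Alice"))  # P1
print(p.pseudonym("Bob"))    # P2
print(p.pseudonym("Alice"))  # P1 again: same person, same identifier
```

Keeping the mapping separate means the transcripts alone cannot re-identify a participant, while the stable identifiers preserve who-did-what across the data.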

RESULTS FROM PRELIMINARY STUDIES


Four pilot studies were conducted to explore security issues involved in the practice of collaborative
sensitive information management: 12 interviews of childcare directors, 16 interviews of physician office
directors, follow-up interviews with 4 childcare directors, and two to three observations in 4 childcares. All
interviews and observations were transcribed. All participants were from the southwest area of Virginia.
The directors were recruited through a comprehensive list of all area businesses; the response rates were
55% for childcares, and 26% for physician offices. Roughly 1,500 pages of data have been collected:
approximately 750 pages of transcripts, 200 pages of observation journals, 123 forms, and 125 pictures.

The preliminary analysis has produced three loci where implicit and explicit norms influence security and privacy. The first, human-mediated access management, explores how the physical placement of filing cabinets and computers within each center was mediated by one person, thereby negotiating access. The second, information duplication, explores the role and use of having information in multiple forms and across distributed locations. The last, community of trust, explores how personnel coordinate and co-construct knowledge of the context to assess nuances in policies. More information about these can be found in [15].

RELATED WORK & RESEARCH CONTRIBUTION


There has been a dearth of research examining the day-to-day practices of childcares and their relation to
information security. However, the work of Kientz et al. has explored how to design a technological
solution for information that is stored and managed about children [13]. One important finding from the
study was that doctors were the most trusted source of information about a child’s development. This
finding speaks to the conceptions that parents have about authoritative information, thus impacting what is
shared and documented about their child. While this work embodies some of the same user needs as my
study (i.e., mass amounts of information, data recording, etc.), it does not focus on the security and privacy
of practice. Additionally, Kientz’s work focuses on how parents manage the documentation, while I am
focusing on how childcares co-manage documentation with parents and other secondary caregivers.

In contrast, there has been extensive research on security and privacy with some focus on medical settings. Prior work has examined the security of private information [3], how documentation and articulation work supports collaboration in a medical setting [8], how to manage the mobility of medical collaboration [5], and how the use and creation of multiple surfaces can support collaboration and management, with a specific focus on supporting work practice [6]. The work of Reddy and Dourish [14] is representative of how practice and context can affect information dissemination. In their paper, temporal rhythms are proposed to explain the community patterns that healthcare workers follow in seeking, providing, and managing information. My work, instead, focuses on the inextricable relationship between the social and technical mechanisms used in medical work practice to negotiate ad hoc security and privacy needs.

It is the combination of these two areas of focus that will provide novel insight into the day-to-day security practice of managing sensitive personal information. To date, no one has examined this area of research to see how people manage policy breakdowns and respond ad hoc. This research will make two contributions: theoretical and technical. First, this work will benefit the health informatics, usable security, and HCI communities by detailing a deep exploration of how communities manage explicit and implicit policies. The results of this body of work will be a set of properties that help the design community create technology and tools to support secure work practice. Additionally, there has been a dearth of research studying how groups manage and coordinate security and put these constructs into practice; this work will add to that body of literature and understanding.

The product of this research will be a set of near-future scenarios depicting positive and negative designs of technologies in response to the categorization and analysis of the breakdowns. From these scenarios, other master’s and Ph.D. students in my lab are starting, and will continue, to design prototypes that further explore the design of flexible and negotiated security policies. For example, we are currently considering how to design technologies that recognize situations in which security policies should be mediated. It is easy to imagine a scenario in which a patient enters an emergency room, the doctors need immediate access to all medical information, and, as such, all privacy restrictions are lifted from an electronic health record. At a basic level, such scenarios suggest technological solutions, such as temporary access or decaying access rights to information, that could be implemented to allow for flexible and negotiated privacy management. Outcomes such as these will further the contribution of my research.
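One way such a decaying access right might look in code is sketched below. This is a hypothetical illustration under assumed names (`EmergencyGrant`, a time-to-live parameter), not a proposed implementation: an emergency grant that expires on its own, so the record reverts to its normal policy without anyone having to remember to revoke access.

```python
# Hypothetical sketch of a decaying access right: an emergency grant
# carries its own expiry time, so access silently reverts to the normal
# policy once the window closes.

import time

class EmergencyGrant:
    def __init__(self, grantee: str, ttl_seconds: float) -> None:
        self.grantee = grantee
        self.expires_at = time.monotonic() + ttl_seconds

    def permits(self, requester: str) -> bool:
        """Allow access only to the named grantee, and only before expiry."""
        return requester == self.grantee and time.monotonic() < self.expires_at

grant = EmergencyGrant("er_physician", ttl_seconds=0.05)
print(grant.permits("er_physician"))  # True while the emergency window is open
time.sleep(0.1)
print(grant.permits("er_physician"))  # False: the right has decayed
```

The design choice of interest is that the exception is bounded by construction, mirroring how social systems bend a policy for a moment without abandoning it.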
PROBLEMS FOR CONSORTIUM
My work is necessarily interdisciplinary. I would greatly benefit not only from attending the AMIA conference, but also from a panel of experts in medical informatics to help me (1) understand novel ways that my research can be applied and made useful to the AMIA community, (2) understand the security and privacy needs as discussed and studied by AMIA colleagues, and (3) discuss alternative evaluation methods that can be applied for triangulating and generalizing my findings.

REFERENCES
[1] New National Survey Finds Personal Health Records Motivate Consumers to Improve Their Health. 2010, California HealthCare Foundation. Available from: http://www.chcf.org/media/press-releases/2010/new-national-survey-finds-personal-health-records-motivate-consumers-to-improve-their-health.
[2] Taking the Pulse. 2010, Manhattan Research: New York, NY.
[3] Adams, A. and A. Blandford, Bridging the Gap Between Organizational and User Perspectives of Security in the Clinical Domain. International Journal of Human-Computer Studies, 2005. 63(1-2): p. 175-202.
[4] Adams, A. and M.A. Sasse, Users Are Not the Enemy. Communications of the ACM, 1999. 42(12): p. 40-46.
[5] Bardram, J.E. and C. Bossen, Mobility Work: The Spatial Dimension of Collaboration at a Hospital. Computer Supported Cooperative Work (CSCW), 2005. 14(2): p. 131-160.
[6] Bardram, J.E., J. Bunde-Pedersen, A. Doryab and S. Sørensen, Clinical Surfaces: Activity-Based Computing for Distributed Multi-Display Environments in Hospitals, in Proceedings of the 12th IFIP TC 13 International Conference on Human-Computer Interaction, Part II. 2009. Uppsala, Sweden: Springer-Verlag.
[7] Bellotti, V. and A. Sellen, Design for Privacy in Ubiquitous Computing Environments, in Proceedings of the Third European Conference on Computer-Supported Cooperative Work. 1993: Kluwer Academic Publishers.
[8] Bossen, C., The Parameters of Common Information Spaces: The Heterogeneity of Cooperative Work at a Hospital Ward, in Proceedings of the 2002 ACM Conference on Computer Supported Cooperative Work. 2002. New Orleans, Louisiana, USA: ACM.
[9] Dourish, P., E. Grinter, J.D. de la Flor and M. Joseph, Security in the Wild: User Strategies for Managing Security as an Everyday, Practical Problem. Personal and Ubiquitous Computing, 2004. 8(6): p. 391-401.
[10] Flechais, I., J. Riegelsberger and M.A. Sasse, Divide and Conquer: The Role of Trust and Assurance in the Design of Secure Socio-Technical Systems, in Proceedings of the 2005 Workshop on New Security Paradigms. 2005. Lake Arrowhead, California: ACM.
[11] Jha, A.K., T.G. Ferris, K. Donelan, C. DesRoches, A. Shields, S. Rosenbaum, et al., How Common Are Electronic Health Records in the United States? A Summary of the Evidence. Health Affairs, 2006. 25(6): p. w496-w507.
[12] Kaelber, D.C., A.K. Jha, D. Johnston, B. Middleton and D.W. Bates, A Research Agenda for Personal Health Records (PHRs). Journal of the American Medical Informatics Association, 2008. 15(6): p. 729-736.
[13] Kientz, J.A., R.I. Arriaga, M. Chetty, G.R. Hayes, J. Richardson, S.N. Patel, et al., Grow and Know: Understanding Record-Keeping Needs for Tracking the Development of Young Children, in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. 2007. San Jose, California, USA: ACM.
[14] Reddy, M. and P. Dourish, A Finger on the Pulse: Temporal Rhythms and Information Seeking in Medical Work, in Proceedings of the 2002 ACM Conference on Computer Supported Cooperative Work. 2002. New Orleans, Louisiana, USA: ACM.
[15] Vega, L., Security in Practice: Examining the Collaborative Management of Personal Sensitive Information in Childcares and Medical Centers, in Computer Science. 2010, Virginia Tech: Blacksburg. p. 104.
[16] Young, M.L. and F.C. Tseng, Interplay Between Physical and Virtual Settings for Online Interpersonal Trust Formation in Knowledge-Sharing Practice. Cyberpsychology & Behavior, 2008. 11(1): p. 55-64.
