On this ESI Survival Guide extended interview installment, we are joined by Debbie Reynolds – The Data Diva – Founder, CEO and Chief Data Privacy Officer of Debbie Reynolds Consulting, creator of the Debbie Reynolds Consulting LLC YouTube channel, creator of The Data Diva Talks Privacy podcast and overall data privacy extraordinaire. Debbie sits down to talk with us about her career path, her current endeavors and a wide range of cutting-edge data privacy and emerging technology issues.
As we push through the electronic wilderness, we touch on various topics including the evolution of data privacy concepts over the years; the Internet of Things, Bodies and Toys; the secrets of smart speakers, the utility of tin foil hats, the shenanigans of IoT manufacturers, biometrics, movies about biometrics, Taylor Swift’s use of biometrics, the possibility of comprehensive federal data privacy regulation, racial and gender bias in facial recognition technology, Debbie’s Data Privacy Survival Kit and more.
Below are the highlights of the various topics of our conversation, along with various Survival Tips and other signposts. And I should highlight, if you are someone who gets a little intimidated by long, awesome video conversations, do not fret, we have included YouTube video timestamps to help get you to your happy place. Check them out on the ESI Survival Guide YouTube Channel (the one with the logo), and support us by sharing, commenting, liking or, better yet, subscribing!
Following our lead-in and our disclaimer, I introduce our very special guest Debbie Reynolds – one of the most prolific data privacy, cyber breach and emerging tech experts in the industry. I could easily spend the entire session listing off her accomplishments and the avenues she travels in the data privacy space. We are super excited to have her with us, and she was both gracious and cool enough to take the time to talk with The Guide.
Debbie has been a professional in the data law space for over 25 years on both the commercial and academic fronts. Just to give an overview of her professional greatest hits:
- She is a world-renowned technologist, industry thought leader, internationally published author, prolific multi-media expert, and more.
- She has authored and contributed to over 100 publications like Today’s General Counsel, Bloomberg Law and Westlaw Journal.
- Debbie has been interviewed and quoted across media outlets like LegalTech News, Law360, Bloomberg Big Law Business, and more.
- She did a spot on PBS!! It doesn’t get much more awesome than that!
- Debbie is the Global Data Privacy Officer for Women In Identity, Privacy Compliance Lead for the AR VR XR Spatial Computing Privacy Framework Evaluation Committee at XRSI, Advisory Council Member for technology company Titanium, Inc., Technology & Cybersecurity Committee Member of the New York State Bar Association (NYSBA), Founding Executive Member of the Digital Directors Network (DDN), and serves on more than 23 Advisory Boards.
- She is a forward-leaning academic and has served as an Adjunct Professor at Georgetown University and Cleveland-Marshall College of Law.
- Debbie was recently named a Top 20 Global CyberRisk Communicator of 2020 by the European Risk Policy Institute, among several other honors and awards.
- AND she is a social media influencer on her YouTube channel – Debbie Reynolds Consulting LLC, one of the most comprehensive channels on Data Privacy on the web.
Now HOW is it possible to be as prolific as Debbie? Well, she offers our viewers some insight into her routine, which includes watching ONLY ONE HOUR OF TELEVISION PER WEEK! Yes, you heard/read that correctly. For her that hour was especially easy to choose when Game of Thrones was on. Now one hour per week is an amazing feat for sure, and one that your host at ESI Survival Guide could never possibly accomplish! Debbie goes on to offer another secret to her success – pushing her professional efforts forward incrementally every single day.
And with that introduction, we launch into the electronic wilderness.
Can you tell us about the evolution of Debbie Reynolds into The Data Diva, and if you could add, what was one thing you did that kept you moving forward during that evolution?
Her path started in 1997, when Debbie’s Mom was reading Caroline Kennedy’s and Ellen Alderman’s book The Right to Privacy. Her mother’s interest in the book, and the way the authors laid out the privacy issues of the time, ignited the spark in Debbie. From there she began to follow various privacy laws, cases and regulatory developments around the world. The rest is history. Now this was before the Internet was as ubiquitous as it is now, but as technology evolved around her, Debbie’s interest in this space quickly grew.
Debbie started her career in library science working on digital transformation projects and then transitioned to helping Fortune 500 corporations create databases for legal documents. This is even before the “e” in eDiscovery folks, a true pioneer indeed! In tandem with this effort, Debbie found herself advising international companies on cross-border data flows. The years went by, and as the world got closer and closer to the GDPR’s effective date, Debbie’s expertise made her an easy choice for lectures, speaking engagements and other opportunities related to the GDPR and all things data privacy (including an opportunity with one MEGA CORP – watch the video to get the scoop on that one). The Data Diva was now in full effect!
A COUPLE OTHER DEBBIE REYNOLDS FUN FACTS WE LEARNED:
- Debbie is a webmaster.
- She taught herself how to use computers through graphic design.
Caroline Kennedy’s and Ellen Alderman’s book The Right to Privacy was published in 1997 and makes a strong case for privacy as a human right as opposed to simply being a consumer rights issue. It also shows through example after example how fleeting privacy can be and how quickly we can lose it. How has the data privacy landscape evolved since you first read this book? Do you think we are in a better place, a worse place or just a different place?
Well for those hoping that all the problems of the book have since been resolved, we apologize for being the bearer of bad news. Debbie talks about how the problems presented in the book still exist, and how the way in which technology has evolved, as well as how data is handled in the Digital Age, complicates these issues significantly. For example, in the late 90s, the privacy concerns may have been a person walking past your window and looking into your house, or maybe someone jumping over your fence to see inside your house. Where do you think the privacy lines were drawn in this context? Fast forward to today. Now we have to be concerned with strangers recording us with their smartphones, our phones tracking us, drone technology and more. The basic lines are still there, but now it is much more zigzaggy.
After reminiscing about the ole AOL CDs we both used to receive in the mail, we talk about how even living in a NY high rise doesn’t immunize you from privacy issues – enter those drones! Technology seems destined to consistently outpace our full understanding of these privacy issues. However, as we evolve, and as new generations of tech savvy populations emerge, we can all hope to at least engage in a game of privacy/technology leapfrog from time to time. Maybe we could simply end up throwing up our hands and tossing our privacy rights into the wind, but if it helps provide any relief, current developments in law and technology do indicate, in our collective opinion, that we are moving towards more concern over these issues and a desire for more control over our private lives.
Now let’s focus on the “Internet ofs” hierarchy: Internet of Everything, Internet of Things and Internet of Bodies. First, for our viewers, can you describe each of these concepts?
At its very core, the Internet of Everything/Things (IoE/IoT) refers to a device that has Internet connectivity. Your Ring doorbell, iRobot Roomba vacuum, your smart refrigerator – these are all IoT devices. By the way, did you realize that nifty automated vacuum is mapping out your home and storing the information?? [Cue Twilight Zone music.]
When we think of an Internet connected device, we often think of something with a screen with which we can interact, a device over which we have some scintilla of control. But think for a second about smart speakers and other devices that you may not even be aware are actively collecting your data. These devices, even the ones you think are activated by a codeword, could be listening to you way more actively than you realize! [Cue Twilight Zone music again.] And unfortunately, those dense Terms of Service rarely provide much comfort.
Turning to the Internet of Bodies (IoB), Debbie discusses her special interest in this area. She focuses on IoB in the context of COVID-19 and the acceleration of technology as it relates to medical data. Just think of how hard it was to get a virtual appointment with a doctor pre-COVID. Now medical professionals are in a position to diagnose patients at a distance, which means patients often have to use wearables that transmit vital signs and other medical data to their doctor or physician.
Debbie classifies patient data and IoB into two camps. The first is when we visit or interact with a medical professional. In the patient/provider context, our data is covered by certain privacy protections. The second camp involves data that is collected by IoB devices in more of a consumer context. Data in this context does not benefit from those same protections because it falls outside of that traditional medical relationship. Think of all the data that your Fitbit collects and stores. Because this data falls under a consumer umbrella, there is very little protection with regards to privacy (and I’m paraphrasing how Debbie really feels about the topic). The companies behind these devices do not have the same obligations to protect your data as those in the medical profession. You really must consider the following questions about your personal information:
- What data is being collected?
- Why is your data being collected?
- Who is collecting your data?
- How is your data being used?
Even if you say to yourself, “I have nothing to hide!” Well, you don’t until you do!
Imagine a scenario where you are using a wearable to track medical data, and that data is then sold by the IoB company to a potential employer. Then, lo and behold, you end up not landing that coveted interview because your vitals cause insurance concerns for that company. You would never know that such discrimination is even happening behind the scenes. [Now that Twilight Zone music should already be triggered in your brain!]
SURVIVAL TIP! When it comes to consumer IoT devices, the name of the game is Caveat Emptor! Educate yourself!
On a later episode of The Guide, we are going to get into the legal considerations regarding data and the Internet of Things, but I really want to jump into the privacy considerations for IoT and IoB. On your YouTube Channel you talk about both extensively. There are IoT devices that are more passive that collect data through various types of sensors and integrated functional software like your smart thermostat, security system, smart refrigerator, smart watch or fitness tracker, and then you have the IoT devices with which you engage, your smart speaker, your smart TV, etc. You note how smart speakers, TVs and other similar devices may be recording much more than just your direct commands and content, which is a pretty sketchy thing to consider. Are there distinct privacy issues consumers should be aware of that are attached to more passive sensor-based IoT devices as opposed to the ones that require more direct interaction, or do the issues overlap for all IoT devices? What can I do proactively as a consumer to protect myself?
SURVIVAL TIP! First and foremost, to protect yourself, you need to know what your IoT device is doing. I know it is painful but READ THOSE TERMS OF SERVICE!
Whether purposeful or just due to the nature of the beast, it isn’t like IoT manufacturers, nor our friends and colleagues in the legal profession, make it fun or easy to read IoT Terms of Service, but it is very important that you know what your device is doing.
IoT companies of course want to sell their devices to the most consumers possible. To that end, they want to include as many features as possible, which then shifts the burden to the consumer to configure the settings and alter/change/turn off any unwanted features. That can be a hefty burden folks.
SURVIVAL TIP! Use caution when putting smart speakers in your work area if you are likely to discuss confidential information. Once that data is heard, recorded and stored, you have diminished your ability to control it.
SURVIVAL TIP! In the same vein as your workspace, exercise caution when putting smart speakers into a child’s bedroom or play space. You can be exposing your children to hackers.
At this point in our conversation, Debbie offered up a very interesting insight – one would think a privacy professional would constantly expose themselves to IoT and related devices and take full control of them through their settings and features. However, it is equally likely that the smartest privacy professionals in the land exhibit the most caution. Maybe they even avoid IoT devices altogether, or at least they act VERY cautiously, ask tons of questions and approach such devices with a healthy dose of skepticism. This skepticism and concern can drive curiosity surrounding IoT devices and topics. Debbie closes this section by talking about the ole trusty Tin Foil Hat! Now THAT would have made for an amazing thumbnail photo for YouTube!
CONSIDER AND COMMENT: Consider a lawyer’s duty to protect confidential client information. What are the ethical implications regarding smart speakers and IoT devices located in the home or office that could record otherwise private conversations? Feel free to comment with your thoughts, ideas and experience.
Doesn’t IoT simply make us more vulnerable to hackers and various attack vectors? Hackers can access corporate systems through CCTV and AC units of all things! In 2018, there was a case where hackers stole a bunch of high roller information and other data from a casino through a smart thermometer monitoring water temperature in a lobby aquarium. Even as security gets better, hackers get smarter. How can the security issues associated with IoT be successfully addressed?
There is a vast gap between consumer understanding of IoT products and how they can protect their privacy and what the manufacturers are doing. Debbie feels that manufacturers don’t have enough skin in the game when it comes to consumer use. Yet again, we see the concept of let the buyer beware rearing its head. Manufacturers are aware that many, if not most, consumers will end up using the default settings for an IoT device. This default stance is typically one where every setting is enabled in a way that ultimately benefits the manufacturer (they couch this in terms of “enhancing the user experience”). Consumers must consider the risk of forgoing privacy vs. the reward of using the device.
We are sharing information now, especially in the context of the pandemic, that we would have never shared before. Debbie reminds us that the age-old greeting, “Hi, how’s it going?” used to be a rhetorical question. Now – not so much. She does inject a bit of a silver lining in stating that she is hopeful that people seem to be caring more and more about their information and are asking the right questions. Being curious, asking questions, understanding the landscape and consequences – these are very strong mechanisms you can use to protect yourself.
Manufacturers want to get you excited to buy a product now! And they often succeed by approaching consumers on a psychological front. They want potential buyers to focus on the benefits of the product, not the potential privacy harms involved. It is important to note that while a product’s key features may present immediate benefits, any harm related to data privacy is often very delayed.
At the end of the day, there are psychological tricks afoot here and manufacturers use them extensively. Even if you did try to control a device or app’s privacy settings up front, once you begin to use the device or app, you may be prompted to make changes to those settings in order to access certain functionality. In effect, you end up undermining your original intent to protect yourself because you want that feature now!
Shenanigans are happening!
CONSIDER AND COMMENT: We discussed that default settings often involve having all the privacy related features that are beneficial to the manufacturer enabled, and this is often painted as benefitting the “customer experience.” This is essentially an “opt out” regime. Consider flipping that model upside down: require IoT device manufacturers to ship with all such features turned off, so that the consumer must “opt in” after some type of tutorial covering what data the device collects, how it is used, where the data goes, the potential privacy risks associated with using the device, and other related questions. Maybe manufacturers should have to conform to Privacy by Design, where there are default settings that allow end users to have access to full feature sets without impacting privacy. In the end, is this an issue that should work itself out in the marketplace, with consumers addressing privacy concerns with their pocketbooks? Feel free to comment with your thoughts, ideas and experience.
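For the technically inclined, here is a minimal sketch in Python of what an opt-in default regime could look like in a device’s settings model. The device features and setting names here are hypothetical, invented purely for illustration – the point is simply that every data-collecting feature starts disabled until the consumer explicitly turns it on:

```python
from dataclasses import dataclass

@dataclass
class DeviceSettings:
    """Hypothetical IoT device settings illustrating Privacy by Design:
    every data-collecting feature defaults to OFF (opt-in), so the
    consumer must deliberately enable each one."""
    always_on_microphone: bool = False
    share_usage_analytics: bool = False
    cloud_voice_history: bool = False
    personalized_ads: bool = False

    def enabled_features(self):
        # List only the features the user has explicitly opted into.
        return [name for name, on in vars(self).items() if on]

settings = DeviceSettings()          # out of the box: nothing enabled
assert settings.enabled_features() == []

settings.cloud_voice_history = True  # an explicit, informed opt-in
assert settings.enabled_features() == ["cloud_voice_history"]
```

Contrast this with the “opt out” regime described above, where every one of those flags would default to `True` and the burden would sit with the consumer to hunt each one down and disable it.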
Debbie goes on to contrast consent in the European context with the more notice-based approach to privacy in the U.S. We also discussed the nature of choice when you only have one option. How meaningful is choice when your options are limited? Are you simply faced with a situation where your choice is not to engage the device at all? As these technologies become much more prevalent and more integrated with daily life, consumers will begin to face some real dilemmas.
We end this section by discussing an “analog” version of the trickery involved with IoT manufacturers. Consider the following scenario (whether fully on point or not): We have all experienced that situation where we’ve been on hold with customer service at a carrier for SO LONG that eventually, we lose the war of attrition, hang up and simply deal with whatever situation we were calling about in the first place for another day or so. Whether this practice is on purpose or not, the result is the same – we accept the unacceptable. Is accepting the unacceptable simply the position we all currently face with regards to the privacy settings in IoT devices?
Debbie then highlights a very interesting tactic that was new to me (and something I will put into practice for sure) – a small means by which we can empower ourselves in the one-sided relationship that exists between manufacturers and consumers. Debbie, who shops for sport, gave us the inside scoop. If you add something to an online shopping cart and then you don’t buy it, often the seller will contact you and offer you a discount. Now if THAT right there is not a useful survival tip to help you navigate the electronic wilderness, then I’m not sure what is.
Now I want to segue back to the Internet of Bodies and biometrics. First, this resource is just as much for tech neophytes as it is for experts. Let me present a basic question: What are biometrics, and are there different categories of biometrics?
Biometrics are essentially the capture and measurement of an individual’s physical features, which are then used as a means to verify their identity. Consider the following biometric identifiers (these are listed from the most commonly known identifiers to some more unique ones that folks may not have considered before):
- Fingerprints – A tried and true biometric marker; used as a means of identification and authentication since antiquity.
- Facial recognition – Are you maybe using this now to login to your phone or computer?
- Retinal Scans – Yep, we have all seen it in the movies! Though, SPOILER ALERT, in Demolition Man, Wesley Snipes’ character uses a disembodied eye to trick a biometric security system – a tactic that may or may not have worked against today’s technology. While a “dead” eye could spoof such a scanner in theory, it is far less likely to work against today’s iris or retinal scanners, which increasingly include liveness detection.
- Gait Recognition – Did you know that how you walk is unique enough to identify you?
- Palm Biometrics – Apparently a newer biometric marker involves running sound waves through the palm of your hand. What next – the unique nuances of an individual’s scream when they are pricked with a nail?
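For readers who like to see concepts in code, biometric verification at its core is “capture, measure, compare”: a live capture is reduced to a numeric template and compared against the enrolled template, with a match declared only above a similarity threshold. This toy Python sketch uses made-up feature vectors and a simple cosine-similarity comparison – it is not any real biometric algorithm, just an illustration of the idea:

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two feature vectors: 1.0 means identical direction.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def verify(enrolled_template, live_sample, threshold=0.95):
    """Declare a match only if the live capture is close enough to enrollment."""
    return cosine_similarity(enrolled_template, live_sample) >= threshold

# Hypothetical templates: the same person never produces an identical capture
# twice, so matching is always a threshold decision, never exact equality.
enrolled    = [0.91, 0.40, 0.12, 0.77]
same_person = [0.90, 0.41, 0.13, 0.75]  # small capture-to-capture variation
impostor    = [0.10, 0.95, 0.80, 0.05]

assert verify(enrolled, same_person) is True
assert verify(enrolled, impostor) is False
```

That threshold is where the real-world trouble lives: set it too loose and impostors get in; set it too tight and legitimate users get locked out – and, as discussed later in this conversation, those error rates are not distributed evenly across populations.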
One thing is certain, the use of biometric authentication is exploding. One recent Juniper Research study outlines this rapidly growing trend (e.g., facial recognition hardware alone is projected to be utilized in 800 million smart phones by 2024, and biometrics are projected to authenticate over $3 trillion in payment transactions by 2025 – up from about $404 billion in transactions in 2020).
And who would have thought that Taylor Swift was on the cutting edge of using clandestine biometric collection techniques to identify stalkers at her concerts? Watch the video for more on this.
Debbie then shares with us an insider resource she uses to keep on the cutting edge of all things biometrics: https://www.biometricupdate.com/.
We have the Illinois Biometric Information Privacy Act, proposed biometric laws in NY, MD, SC and CA. Federal biometric protections were introduced in the Senate in 2020. What does the biometrics legal landscape look like today?
In describing the legal landscape of biometrics in the U.S., Debbie is hopeful that the legal realm will catch up with the various data privacy issues and biometric wrinkles that abound. However, she is a bit dismayed that many feel our current laws are sufficient. Debbie challenges us to really look at the laws that are on the books and evaluate them against current needs. She discusses a recent video she did on the Computer Fraud and Abuse Act (CFAA) https://www.youtube.com/watch?v=VbOIjWQrsy4, which was signed into law by Ronald Reagan, reportedly spurred in part by his viewing of the Matthew Broderick movie WarGames (a personal favorite of mine with an amazing ending quote from the A.I. supercomputer antagonist Joshua/WOPR about nuclear war: “A strange game. The only winning move is not to play. How about a nice game of chess?”). Though the CFAA was forward-looking in the 80s, 30+ years later we are still looking to that law for guidance. Just consider how old some of the technology-related laws on the books are, even in light of any amendments (e.g., The Wiretap Act of 1968). U.S. lawmakers need to take stock of how other governments around the globe handle data privacy in the context of commerce. In the end, we may have to rely on consumers voting with their pocketbooks and, hopefully, driving the common-sense regulatory change we need.
Consider how different the data types that are at issue here are from the past, and how data has evolved over time. When your bank account or your Facebook account is hacked, you can change your password, you can update security settings, you can take real steps to try and stop the bleeding. Now consider a biometric data breach (which we will discuss in more depth on a future installment of ESI Survival Guide). I cannot necessarily change my fingerprints or my retina – so what do I do?
Debbie suggests that it is difficult now to get large initiatives accomplished, namely, protections at the federal level, and highlights how states can address privacy issues more quickly. To illustrate this point, she underscores the history of developments in comprehensive privacy regulations in regions such as Canada (PIPEDA) and the E.U. (GDPR), which have roots in more localized provincial and member state efforts, respectively.
Do you think there is ever going to be comprehensive data privacy regulation in the U.S. at the federal level?
Debbie’s answer can be summed up with two phrases:
- “If we get a federal law, it will not be soon.”
- “And it will not be very comprehensive; it will be ‘wafer thin’.”
Debbie goes on to suggest harmonizing state data breach notification laws at the federal level to provide lawmakers with a foundation upon which they can build more comprehensive regulation.
I would like to address an extremely important area of concern with biometrics; I think the most important aspect. Over the past few years there has been this necessary awakening with regards to race in America. The historical inequities that have underpinned our system for centuries are being called out. You might even say they are facing a reckoning, and all of us must actively address how these issues impact life in the black community and other communities of color. I am also trying to be honest with myself, the times in which I grew up, unconscious bias and privilege. I never thought about how biometrics could impact people of color differently until I read your interview with Startpage. Can you comment on how biometrics impact people of color differently, and what are the implications of these differences? https://www.startpage.com/privacy-please/privacy-advocate-articles/privacy-in-action-debbie-reynolds-global-data-privacy-expert
Debbie begins by highlighting a paper co-authored by Dr. Timnit Gebru, computer scientist, advocate for diversity in technology, and co-founder of Black in A.I.; and Joy Buolamwini, computer scientist, digital activist, PhD candidate at the MIT Media Lab, and founder of the Algorithmic Justice League. Their groundbreaking paper, entitled Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification, unpacks the bias present in automated facial analysis algorithms based on gender and skin color. The paper can be read here: http://proceedings.mlr.press/v81/buolamwini18a/buolamwini18a.pdf
The Gender Shades paper focuses on how certain facial recognition programs leverage a six-point color chart to identify people (after reading the above-mentioned paper, I learned that this color chart is called the Fitzpatrick Skin Type classification system). This classification system represents skin color in a very narrow range of tones. Because of this narrow color palette, and because people of color are rarely even involved in the testing of these underlying technologies, the error rate in the misidentification of people of color, and in particular black women, is significantly higher than that of white males.
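The methodology underneath that finding – disaggregating error rates by subgroup rather than reporting a single overall accuracy number – can be illustrated in a few lines of Python. The records below are toy data invented for illustration only, not the paper’s figures; the point is how an aggregate number can hide a large disparity:

```python
from collections import defaultdict

def error_rates_by_group(records):
    """Compute the misclassification rate per demographic subgroup.
    Each record is a tuple: (subgroup_label, predicted, actual)."""
    totals = defaultdict(int)
    errors = defaultdict(int)
    for group, predicted, actual in records:
        totals[group] += 1
        if predicted != actual:
            errors[group] += 1
    return {g: errors[g] / totals[g] for g in totals}

# Toy data: the classifier is perfect on one subgroup and wrong half
# the time on another, yet overall accuracy still looks like 75%.
records = [
    ("lighter-skinned male", "male", "male"),
    ("lighter-skinned male", "male", "male"),
    ("lighter-skinned male", "male", "male"),
    ("lighter-skinned male", "male", "male"),
    ("darker-skinned female", "male", "female"),
    ("darker-skinned female", "female", "female"),
    ("darker-skinned female", "male", "female"),
    ("darker-skinned female", "female", "female"),
]

rates = error_rates_by_group(records)
assert rates["lighter-skinned male"] == 0.0
assert rates["darker-skinned female"] == 0.5
```

This is why evaluating a facial analysis system on an aggregate benchmark alone can certify a tool as “accurate” while it systematically fails particular communities.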
Debbie also expresses her concerns with technologies in the health care space that are using facial recognition or skin color recognition to diagnose conditions such as skin cancer. Consider the potential for an incorrect diagnosis using such a narrowly focused algorithm!
And just as problematic is that the sale of such technology is extremely unregulated. Consider such facial recognition tools being used by law enforcement where the potential for abuse or misuse could have significant implications. There are cases where facial recognition technologies have been used to match individuals to images of a suspect, which ultimately led to false arrests.
Here are two links related to the outcry for regulations around facial recognition technology. The first offers an overview of the issue from The Regulatory Review, and the second is a very positive piece about Microsoft’s commitment to the ethical use of such technology.
The bias and error rates of these tools, combined with the lack of oversight over how such technology is onboarded and used by law enforcement, create a very troubling situation that must be addressed immediately. At the most basic level, the solution is two-fold: (1) limit the way the technology can be used, and (2) improve the technology itself.
Debbie reminds us that at the end of the day, we are the ones in control of the technology – not the other way around. Humans must be making the final decisions when it comes to scenarios where algorithms can have such a deleterious impact on individual lives. She put this into context with a lighthearted anecdote that harkens back to the days when the paper maps you purchased from the drug store or gas station guided our driving efforts, not GPS. Maps couldn’t cause too much harm on their own, it was all user error. But if your GPS directs you to drive into a lake, are you going to blindly follow the technology into that watery abyss?
CONSIDER AND COMMENT: What are other ways in which technology impacts issues of race and gender? Feel free to comment with your thoughts, ideas and experience.
To further explore the subject of racial and gender bias in facial recognition algorithms, take a few minutes and watch this eye-opening video Gender Shades by Joy Buolamwini: http://gendershades.org/. Also, check out her TED Talk on the subject: https://www.ted.com/talks/joy_buolamwini_how_i_m_fighting_bias_in_algorithms?utm_campaign=tedspread&utm_medium=referral&utm_source=tedcomshare
As we near the end of our discussion, I would like to turn to another IoT topic – The Internet of Toys. I remember all the fervor that Mattel’s Hello Barbie caused back in 2015. How far have we come with regards to connected toys and what are the specific privacy issues with regards to children in this context?
I start the conversation on the Internet of Toys by highlighting a personal anecdote about this pretty amazing IoT device called Moxie that I almost bought for my 4-year-old. However, right before the pre-order, I opted out. Check out the original advertisement for Moxie here: https://www.youtube.com/watch?v=7YRNjclHTHg
Debbie follows up my comment on Moxie by highlighting a connected toy that isn’t sold in Illinois because of manufacturer concerns over BIPA. We are far beyond the days of the Easy Bake Oven and Baby Alive, and now we face toys that could easily introduce a hacker into the environments where our children play.
In the U.S., there are efforts to strengthen data privacy laws as they relate to children. For example, the CPRA, which passed in November 2020 and takes effect in 2023, triples the fines for select privacy violations impacting certain personal information of children.
CONSIDER AND COMMENT: To close out our discussion on the intersection between children, data privacy and the Internet of Toys, we consider the division of responsibility between parents and developers/manufacturers. Debbie highlights FaceApp’s intense Terms of Service and the implications such apps can have for kids. Have we reached the right balance between parental consent and corporate responsibility or is there more to do? What are the limits on parental consent in this context and should those limits change? Feel free to comment with your thoughts, ideas and experience.
What are three items in your Data Privacy Survival Kit?
- The Right to Be Left Alone! MIC DROP! Debbie adds this to her Data Privacy Survival Kit because it’s the practical version of her most desired superpower: invisibility.
- Don’t be so excited to get the next new thing. Be skeptical and smart. Never be the first one to rush into new technology.
- Get involved! Manage yourself, have conversations with your circle, have that difficult talk with your parents about 2FA, be of service to those around you as best you can and help them with emerging technology issues. Be a guide for your network.
SURVIVAL TIP! The most important takeaway is that third item in Debbie’s Data Privacy Survival Kit – GET INVOLVED. The best way to survive in the electronic wilderness is to learn. Getting involved is a major part of learning, as well as both gaining and maintaining technology competence.
You are on the frontier, the forward edge of the data privacy landscape, and you do so much in the business, academic and advisory realms. What is next for The Data Diva? What is the one thing or project in 2021 that you’re most excited about?
Debbie is involved in a great deal of advisory work and loves to focus on being proactive in the space, rather than react to issues.
She then BLOWS OUR MIND by highlighting her newest endeavor, which combines many areas of her expertise. Debbie is currently working with XRSI to create a data privacy framework for developers in the Virtual Reality (VR), Augmented Reality (AR) and Mixed Reality (MR) spaces. This is an international endeavor involving a huge team of experts from a wide array of industries and sectors, including entertainment, education, medical, creatives, and more (it includes at least one fellow from NASA too, which is pretty awesome). Debbie is running the compliance component of the project. Now THIS is about as visionary and forward-leaning as you get! For more information about XRSI, follow this link: https://xrsi.org/publication/the-xrsi-privacy-framework
And just when you think it cannot get any more mind-blowing, Debbie drops on us the fact that developers are creating employee training simulators where participants role play through an avatar. Just imagine the new HR tasks that will emerge when virtual reality sexual harassment training programs start to crop up! But any use of technology that helps to support a safer, more productive, more equitable, more diverse and more inclusive environment is a very good and very necessary development.
Debbie concludes by emphasizing a current use of VR technology that hits home for many of us in the legal profession – The Virtual Reality Conference. She recently attended one such conference where she was able to explore the virtual “campus” using an avatar. The ways in which people have used technology to engage people during the pandemic have demonstrated amazing levels of creativity.
After some final parting words, we say our goodbyes.
We cannot wait to have Debbie back on The Guide!
Thanks so much Debbie! There is so much more that we could talk about and I definitely hope you will come back and join us for future discussions.
Everyone! If you are interested in ANYTHING related to data privacy or if you are facing any data privacy issues, you must go to www.debbiereynoldsconsulting.com or visit the Debbie Reynolds Consulting LLC YouTube channel here: https://www.youtube.com/channel/UCVZ2nIE9bw43aH1QZVJh2UQ
Whether you watched the full video, viewed the snippets or read the blog, we cannot thank you enough for your support and interest!
This is Matt from ESI Survival Guide telling you to please Stay Safe in the Electronic Wilderness. See you next time!