More stories


    How to make privacy your company's 'killer app'

For years, privacy was a largely theoretical concern for technology companies. If asked directly, consumers would express concern about how companies gathered, stored, and shared their personal data. But those same consumers would also happily use products that harvested and sold their data, or had lax controls that resulted in their personal data being stolen. Perhaps it was a case of willful ignorance or a demonstration that privacy wasn't all that important after all.
    Privacy gets real

    A confluence of factors has elevated privacy as a concern among consumers, and it appears to be a trend that’s gaining ground. Large-scale data breaches are regular news, as are the emails or letters notifying consumers that organizations, from the federal government to credit agencies, have been hacked and leaked personal information. According to Pew Research from November 2019, just 6% of Americans believe their data are more secure today than in the past.
This trend is combined with leading social media and consumer companies being called to account for their data collection. While consumers were once happy to ignore the fact that these 'free' apps were actually using them as a product, many are now painfully aware that not only is their data being relentlessly harvested, but the content they enjoy can be shaped by anyone from a company hoping to sell them something to a foreign government with more nefarious purposes. Initiatives like the EU's GDPR highlighted that there might be more to the privacy argument than merely trusting app makers to do the right thing. A 2019 Freckle consumer study showed almost half (46%) of the sampled group of 1,200 consumers don't trust Facebook with their data.
    3 ways to make privacy a priority
    Apple was one of the first large tech companies to identify privacy as a potential differentiator, with high-profile announcements that its phones were so secure it couldn’t even unlock their data for law enforcement. Picking fights with government law enforcement agencies is probably not something most companies are prepared to do, but privacy can be a differentiating feature in an increasingly competitive market. Consider the following three ways to boost your next product’s privacy credentials.
    1. Challenge the ‘monetize data’ bromide
    If you’ve ever participated in a technology product or platform discussion, invariably it will be suggested that ‘monetizing data’ should be a key revenue generator from the product in question. This is a fancy way of suggesting that the product should rigorously harvest users’ data, which will then be sold to the highest bidder. Unfortunately, this attitude will permeate every aspect of the product. A notorious recent example is the popular video conferencing tool Zoom suggesting that it would capture users’ physical location and the content of their conferences and use it for advertising.

Many consumers are now wary of being monetized, and the damage to your brand could cost significantly more than any revenue gained from selling users' data. Furthermore, you're unlikely to gather new data that companies like Acxiom don't already have. Even more insidious, 'monetize data' is often a way of saying "I have no idea in hell how we're going to make money with this thing."
    2. Make privacy a feature
    In a universe of organizations attempting to gather every possible nugget of data, explore what would happen if you took the opposite tack. What if your app didn’t even require an email address or login to use it? What if you created tools that enabled privacy rather than attempted to track every movement? What if you worked with your human resources department to create a privacy-focused application process that addressed emerging concerns around racism in the hiring process? What if your company website had a nice little footer explaining that you didn’t track visitors’ every click rather than an obnoxious “WE USE COOKIES” pop-up, and your ‘Terms and Conditions’ were a paragraph rather than 68 pages?
    If consumer demands around privacy are changing, rather than seeking to exploit more of your customers’ data, it’s worth at least considering how you might do the opposite.
    3. Toot your privacy horn
When you start to regard privacy as a 'killer app', and build it into your applications and processes from the get-go, you need to publicize how you've differentiated your products. Yet another platform sucking down data is unlikely to get much notice, but a privacy-first social network might be interesting enough to get some free marketing from the press. Apple all but shouted its shift to privacy-focused products from the rooftops, placing billboards touting the company's privacy features at key industry conferences.
    Consumer preferences around privacy have clearly shifted, yet most companies seem to be begrudgingly doing the bare minimum to comply with policy and preferences. Why not set your products and even your company practices apart by considering privacy as a critical feature, rather than an afterthought?


    Survey: Companies struggle to implement data privacy initiatives

Despite the common refrain that a privacy breach is "not a matter of if, but when," companies are still struggling to implement data privacy protocols, according to a recent TechRepublic Premium survey.
    Of the 186 professionals surveyed in July 2020, 37% said that their company did not have a dedicated privacy team, while 44% said their company’s privacy team had one to five employees. Only 6% of respondents claimed 10 or more members on their company’s privacy team.
    SEE: Report: SMBs unprepared to tackle data privacy (TechRepublic Premium)
    Barriers to data privacy

Other barriers to data privacy included corporate culture (37%), lack of knowledge (35%), financial cost (33%), lack of resources (33%), integration with existing tools (28%), lack of technical skills (25%), and lack of leadership (24%).
    Other respondents cited the complexity of GDPR (18%), lack of available technology (8%), and a business model that relies on user surveillance (8%) as challenges to enabling data privacy. 

The General Data Protection Regulation (GDPR), a set of regulations designed to protect the data security and privacy of EU citizens that applies to any business entity that transacts with them, went into effect May 25, 2018. Yet 16% of applicable respondents admitted that their organizations were not meeting requirements, 16% were still in the process of meeting them, and 26% were unsure about their company's compliance. Only 35% of respondents were meeting all GDPR requirements.
When it comes to the California Consumer Privacy Act (CCPA), a state statute intended to enhance privacy rights and consumer protection specifically for California residents, 26% of applicable respondents were meeting or in the process of meeting all requirements, 14% were not meeting requirements, and 28% were unsure of their company's compliance.
    SEE: TechRepublic Premium editorial calendar: IT policies, checklists, toolkits, and research for download (TechRepublic Premium)
A wide range of tools are available to help companies carry out their data privacy initiatives. The majority of respondents (62%) are implementing or considering data backup/recovery solutions. More than half use or are considering endpoint protection (54%), data loss prevention (52%), and encryption software (52%). Close to half use or may use Identity and Access Management (IAM) tools (48%) or Mobile Device Management (MDM) (43%). Other tools being used or considered are compliance software (30%), Customer Data Management (CDM) platforms (19%), and consent management applications (16%).
    Who is responsible for protecting data privacy?
The majority of survey respondents (51%) reported that IT is responsible for their organization's data privacy. Further, the privacy leader within respondents' organizations included the chief information officer (CIO)/chief technology officer (CTO) at 21%, data protection officer (DPO) at 16%, chief information security officer (CISO) at 11%, chief privacy officer (CPO) at 8%, and general counsel/chief counsel/chief legal officer (CLO) at 5%. In addition, 19% of respondents were unsure who their privacy leader was, 16% said 'other', and 5% said their organization was in the process of creating a position for this task.
    To read more findings, plus analysis, download the full report: Report: SMBs unprepared to tackle data privacy (available for TechRepublic Premium subscribers).


    An in-depth look at how the enterprise is navigating data privacy

    Thanks to social media, wearable devices, and media streaming platforms, companies are collecting more data about us than ever before. While this data gathering continues to grow, so does public unease about data collection and data usage. 


Companies are now tasked with balancing their data needs with the public's demand for data privacy. How can they accomplish this? ZDNet and TechRepublic published the ebook Navigating data privacy (free PDF) to find out.
    Here’s a look at what’s in this free PDF ebook.
    Rising concerns about privacy present an opportunity for savvy companies. Learn more in TechRepublic contributor Patrick Gray’s feature “How to make privacy your company’s ‘killer app.'”
In "Data privacy and data security are not the same," TechRepublic contributor Allen Bernard examines how companies that fail to understand the differences between data privacy and data security put their brands and bottom lines in jeopardy.

    Everything you do online leaves a trace — in more ways than you may realize. TechRepublic contributor Brandon Vigliarolo investigates in his feature, “Personally identifiable information (PII): What it is, how it’s used, and how to protect it.”
    SEE: Navigating data privacy (free PDF) (TechRepublic)
    In the article “How new apps protect the health and privacy of employees,” TechRepublic’s Veronica Combs reports on how employers manage to protect employees’ health and preserve their privacy as workplaces reopen during the coronavirus pandemic.
Also in this ebook, ZDNet contributor Eileen Yu reports on a wearable device the Singapore government is developing in "Singapore looks to ease privacy fears with 'no internet' wearable device," and ZDNet's Daphne Leprince-Ringuet investigates why Norway has put its contact tracing app on hold in "Contact tracing: Now Norway suspends use of its app, citing privacy fears."
The General Data Protection Regulation (GDPR) launched two years ago, but the data privacy law still faces challenges across the European Union. ZDNet's Danny Palmer investigates in "GDPR two years on: Why there's still work to be done on data protection."
    As concerns about online privacy continue to escalate, TechRepublic contributor Mary Shacklett addresses the key issues every privacy policy needs in the feature, “How to create a privacy policy that protects your company and your customers.”
    To read all these articles, plus details on original research from ZDNet sister site TechRepublic Premium, download the ebook: Navigating data privacy (free PDF).


    Data privacy and data security are not the same

    Ever since the September 2017 Equifax data breach that exposed the personal information of 147 million Americans, and the many other high-profile data breaches that have happened since, data security and data privacy have become pressing boardroom-level concerns.

    “The Equifax debacle is where a lot of the inherent [cybersecurity] issues really surfaced to the business level,” said Aaron Shum, practice lead, Security, Privacy, Risk, and Compliance, at Info-Tech Research Group. “It’s where we discovered the level of incompetence that can exist in an organization.”
    According to the 2019 Edelman Trust Barometer Special Report: In Brands We Trust?, 81% of consumers said that brand trustworthiness plays a major role in their buying decisions. In other words, data breaches today not only represent a bottom-line risk in the form of penalties, but they also jeopardize an organization’s brand and reputation, directly impacting its ability to attract new customers and retain existing ones.
"Businesses need to treat privacy as both a compliance and business risk issue to reduce regulatory sanctions and commercial impacts such as reputational damage and consequential loss of customers due to privacy breaches," said Steve Durbin, managing director of the Information Security Forum in the UK.
    More than semantics
    For many outside of the infosec community, the terms ‘data security’ and ‘data privacy’ are often used interchangeably. In reality, even though they share a common goal, they are not the same, said Greg Ewing, cybersecurity partner at Potomac Law. 

    “The difference between data privacy and data security is the difference between protecting someone’s personal information and the security measures you have in place to protect all of your business’ information,” he said. 

With regulations like the California Consumer Privacy Act (CCPA) and the EU's General Data Protection Regulation (GDPR) now in effect, this distinction is more than a matter of semantics. The GDPR, for example, imposes serious financial penalties that can range into the billions of dollars for data breaches involving personally identifiable information (PII) of EU citizens. At between $2,500 and $7,500 per PII record, non-compliance penalties under the CCPA can add up quickly as well: a breach involving just 10,000 records could mean $25 million to $75 million in penalties.
    With the COVID-19 pandemic showing no signs of abating, more people are spending more time online than ever. The massive shift in online usage both pre- and post-COVID-19, combined with the general distrust of how large social media and entertainment companies monetize customer data, is not going unnoticed by state regulators. According to the National Conference of State Legislators, privacy bills are now under consideration in 30 states. 
    “Data privacy is, in essence, a subset of an organization’s data security,” Ewing said. “The distinction is important because, although the tools used to maintain data privacy and to ensure data security may overlap, the two are generally addressed differently by different teams using different tools.”
    This overlap can cause confusion, leaving companies who focus just on data security with the false impression that, by default, data privacy also is protected. This is not the case. Unlike data security, which focuses on protecting all of an organization’s data from theft or corruption (like during a ransomware attack), data privacy is more granular. To ensure data privacy, organizations must understand, track, and control things like who is authorized to access the data and where the data is stored — in a Health Insurance Portability and Accountability Act (HIPAA)-compliant cloud, for example.
A good example of the difference between data privacy and data security was the harvesting of 87 million Facebook user profiles by the now-defunct political consulting firm Cambridge Analytica during the 2016 US presidential election, said Joshua Kail, a communications consultant who ran agency-side PR for Cambridge Analytica until it shut down in May 2018. Even though the data was secure, Facebook abused its own privacy policy and a 2011 FTC consent decree regarding the use of user data.
    “It was a strange instance of a failure of data security from [Facebook’s] perspective in that they basically handed the data over and then it was used in an inappropriate way, rather than a traditional malicious cyberattack,” he said. “As far as data privacy is concerned, we all lost that the moment we signed up with an account. Really, it wasn’t ‘our data’ that was used by Cambridge, it was Facebook’s data about us. This distinction is where the real danger in current data policies lives.”
    Kail recently appeared on Bill Detwiler’s TechRepublic Dynamic Developer podcast discussing data privacy and data rights.
    A matter of trust
    While data privacy is becoming more regulated every year, it is still a matter that, today, largely comes down to trust, said Kayne McGladrey, a cybersecurity strategist at Ascent Solutions. As the backlash in the wake of the Cambridge Analytica scandal shows, what people expect from the companies they do business with is just as important as the laws that govern the use of their data. 
    “Today’s data privacy is primarily concerned with the processing of personal data based on laws, regulations, and social norms,” McGladrey said. “Often this is represented by a consumer ignoring an incomprehensible privacy policy (that would take nearly 20 minutes to read) before clicking a button to acknowledge their consent to that policy. Their acceptance of the policy allows the organization to handle their data in documented ways, such as using it to show them targeted advertising based on their inferred interests. However, if that organization sold those personal data to another organization to do something unexpected (like using it to suppress protected free speech) without the consumer’s consent, that would be a breach of privacy, either by regulatory control or by a violation of social norms.”
    Given all that has happened in the past few years — the constant drumbeat of massive data breaches and the ever-escalating cyber attacks on businesses and individuals — it is not surprising that people feel their data is no longer safe in the hands of the companies they do business with, or the governments that mandate its collection. 
    Because of this mistrust, the imperative for businesses to get out in front of these issues could not be greater, said Lili Ana, Information Security Governance manager at loanDepot.
    “As the hacking industry rapidly grows and cybercriminals become more well-funded, and as the global transformation of digital at-home workplaces continues to be the new normal, companies must take action to understand information security and how data privacy and data security work together to protect businesses and consumers,” Ana said. “Investing in safeguarding your business in a proactive approach is far less costly than the alternative, which is a data incident or breach that not only can destroy a business but can ruin reputation, credibility, and consumer trust.”


    Personally identifiable information (PII): What it is, how it's used, and how to protect it

    No matter what kind of device you’re using or what you’re doing on it, data is constantly being created that can be traced back to you.
    Personally identifiable information (PII) comes in many forms, and in many cases is created without you even realizing it. That data can be used to learn things about you, your habits, your interests, and can be monetized or used by malicious actors to steal your identity or hack your accounts.
    Knowing what PII is, what it’s used for, and how to protect it are all essential parts of staying safe online. 
    SEE: Zero trust security: A cheat sheet (free PDF) (TechRepublic)
    What PII is, and what it isn’t

Multi-factor authentication provider Okta, in its 2020 Cost of Privacy report, lists 13 distinct categories of data that can be considered PII:
    Usernames and passwords
    Emails and sent messages
    Data entered into online forms
    Online profiles
    Internet history
    Physical location when online
    Online purchase history
    Search history
    Social media posts
    Devices used 
    Work done online
    Online videos watched
    Online music, playlists 
    The Okta report lists those categories in descending order (as seen above) to show how aware survey respondents were that those types of data were PII. By the time you reach ‘physical location when online’, less than half of respondents realized that type of data could be used to identify an internet user.
    SEE: SSL Certificate Best Practices Policy (TechRepublic Premium)
    The US National Institute of Standards and Technology (NIST) defines PII fairly broadly as “any information about an individual maintained by an agency, including any information that can be used to distinguish or trace an individual’s identity, such as name, social security number, date and place of birth, mother’s maiden name, or biometric records; and any other information that is linked or linkable to an individual, such as medical, educational, financial, and employment information.” 

That definition breaks PII into two categories: linked data, which is data directly connected to a person; and linkable data, which is not directly associated with a person's identity but can be connected to it with a bit of work.
    NIST’s definition of PII goes beyond online data and includes paper documents, ID cards, bills, bank statements, and other records. In the case of online data, much of it falls into what NIST calls ‘linkable’ data, especially if that data is anonymized or doesn’t contain data about you as a person, like some tracking cookies, IP addresses, and machine IDs.
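To make the 'linkable' idea concrete, here is a hypothetical sketch in Python using pandas; the datasets and field values are made up for illustration. A dataset that contains no names can still be re-linked to named individuals by joining on quasi-identifiers such as ZIP code and birth year.

import pandas as pd

# A "de-identified" dataset: no names, but it carries quasi-identifiers.
deidentified = pd.DataFrame({
    "zip": ["30301", "30301", "60601"],
    "birth_year": [1980, 1991, 1975],
    "diagnosis": ["A", "B", "C"],
})

# A separate, public-style dataset that does carry names.
public_records = pd.DataFrame({
    "name": ["J. Doe", "A. Smith"],
    "zip": ["30301", "60601"],
    "birth_year": [1991, 1975],
})

# Joining on the shared quasi-identifiers re-links rows to named people.
relinked = deidentified.merge(public_records, on=["zip", "birth_year"])
print(relinked[["name", "diagnosis"]])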
    Some of the more ambiguous forms of PII, like IP addresses, have been argued both ways and nothing clear has emerged from more than a decade of debate over whether they can be used to identify someone.
    In 2009, the Johnson v. Microsoft decision found that IP addresses were not PII because IP addresses identify a computer, not a person. This conflicts with a 2008 court case in New Jersey, which held that customers had a reasonable expectation of privacy in regards to IP addresses. It also conflicts with guidance from NIST that describes IP addresses as PII.
    Ambiguous data comes in many forms, like website tracking data, cookies, advertising profiles, and other information that can be kept separated from more easily linked PII, but can be combined by the companies that operate those services. In 2016, Google amended its privacy policy (which has since been changed) to allow it to connect cookie information to PII for the sake of “improving Google’s services.” 
    SEE: TechRepublic Premium editorial calendar: IT policies, checklists, toolkits, and research for download (TechRepublic Premium)
    How PII is used
    PII is used in both legitimate and illegitimate ways. A user’s browsing history, cookies served by websites, and search history are often used to serve targeted advertisements, which is why social media advertisements can be so oddly specific.
    It’s illegitimate uses of PII that garner more interest, and should be of greater concern, for internet users. Yes, targeted ads and the privacy violations that have been committed in service to them are a problem, but the fallout from a cybercriminal gaining access to your PII can be far worse. 
PII leaks were the leading type of data breach in 2018 because of how valuable that data is: With one piece of information, an attacker can home in on an individual target for a phishing attack, use that data to search for additional information about a person, or use it to break directly into an online account.
    PII can also be used to launch social engineering attacks, which are one of the most popular hacking methods currently in use: Why go through the work of developing a complicated hack when you can simply use PII stolen in a breach and some social media posts to guess your way into someone’s account? 
    SEE: VPN usage policy (TechRepublic Premium)
    Protecting your PII
It can be tough to protect your PII, especially since so much of it is collected in the background by websites and services you use every day. In other cases, websites you trust with more sensitive PII, like your name, address, email address, and banking information, can be breached, and there's nothing you can do about it.
    That doesn’t mean you’re completely unable to protect your PII, though. There are many precautions you can take to minimize your PII footprint and protect your information when you absolutely have to provide it.
    Identity theft protection provider NortonLifeLock recommends the following PII protection steps:
    Be careful what you post on social media: It’s easy to guess password hints and other personal info from posts. When possible, limit your social media audience to people you know.
    Invest in a paper shredder to protect physical PII.
    Don’t just hand over sensitive info like your social security number when asked — find out why it’s needed and how it will be protected first.
    Leave sensitive documents, like your social security card and passport, at home unless you need them.
Other steps include using an intermediary payment service like PayPal or Privacy.com instead of giving vendors your credit card or banking information. You can also configure your web browser to block tracking cookies and enable Do Not Track mode (not always effective), or install a browser add-on like Ghostery that lets you block individual elements that may be tracking or harvesting data.
In addition, you should regularly clear your browser history, cookies, and other temporary files that contain PII; use a VPN when handling sensitive information or browsing the web on an unsecured public Wi-Fi network; and use incognito mode in your web browser to avoid keeping local records tied to your identity.
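As a small illustration of those browser-level controls, the Python sketch below (using the requests library; the URL is a placeholder) sends the advisory Do Not Track header and discards cookies between requests. As noted above, DNT is not always honored by sites, so treat this as one layer among many rather than a guarantee.

import requests

# Send the advisory Do Not Track header and avoid accumulating cookies.
session = requests.Session()
session.headers.update({"DNT": "1"})  # Do Not Track request header

resp = session.get("https://example.com")  # placeholder URL
print(resp.status_code)

# Drop any cookies the site set so no persistent identifier carries over.
session.cookies.clear()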


    BlackBerry releases new security tool for reverse-engineering PE files


    Today, at the Black Hat USA 2020 security conference, BlackBerry released a new tool for the cyber-security community.
Named PE Tree, this is a new Python-based app for Linux, Mac, and Windows that can be used to reverse-engineer and analyze the internal structure of Portable Executable (PE) files, a common file format that malware authors use to hide malicious payloads.
    The tool has been open-sourced on GitHub since last week, but today marks its official release.
    “Reverse engineering of malware is an extremely time- and labor-intensive process, which can involve hours of disassembling and sometimes deconstructing a software program,” the company said in a press release today.
    “The BlackBerry Research and Intelligence team initially developed this open source tool for internal use and is now making it available to the malware reverse engineering community,” it added.

    According to BlackBerry, PE Tree’s benefits include:
    Listing PE file content in an easy-to-navigate tree view
    Integration with the IDA Pro decompiler (easy navigation of PE structures, dumping in-memory PE files, performing import reconstruction)
    VirusTotal search integration
    Can send data to CyberChef
    Can run as either a standalone application or an IDAPython plugin
    Open source license allows community contributions
    The tool is an alternative to PE-bear, a similar app developed by Malwarebytes malware analyst Aleksandra “Hasherezade” Doniec.
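PE Tree itself is available on GitHub; as a rough illustration of the kind of structural listing such tools automate, here is a short sketch using the separate, widely used pefile Python library (assumed installed via pip; the file name is a placeholder, not part of PE Tree).

import pefile  # third-party library: pip install pefile

# Walk a Portable Executable's headers, sections, and imports.
pe = pefile.PE("sample.exe")  # placeholder file name

print(f"Machine: {hex(pe.FILE_HEADER.Machine)}")
print(f"Number of sections: {pe.FILE_HEADER.NumberOfSections}")

# Sections: name, virtual size, and raw size on disk.
for section in pe.sections:
    name = section.Name.rstrip(b"\x00").decode(errors="replace")
    print(f"  {name}: vsize={hex(section.Misc_VirtualSize)} "
          f"raw={hex(section.SizeOfRawData)}")

# Imported DLLs, if an import directory is present.
if hasattr(pe, "DIRECTORY_ENTRY_IMPORT"):
    for entry in pe.DIRECTORY_ENTRY_IMPORT:
        print(f"  imports from {entry.dll.decode(errors='replace')}")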
    Cyber-security vendors embracing the open-source space
    PE Tree also marks the release of yet another useful cyber-security tool into the open source space. This is a major change in approach for cyber-security firms, which have historically kept their internal tools out of the public eye, or closed-source and under expensive commercial licenses.
    Over the past two years, we’ve seen:
    FireEye release CommandoVM, a Windows-based virtual machine specifically built for malware research, as an alternative to Kali Linux, the community’s favorite OS.
    FireEye release Flashmingo, an app to automatically search for Flash vulnerabilities.
    FireEye release Crescendo, a real-time event viewer for macOS.
    FireEye release StringSifter, a machine learning tool that automatically ranks strings based on their relevance for malware analysis.
    FireEye release SharPersist, a red-team utility for establishing persistence on Windows using different techniques.
    FireEye release Capa, a tool that can analyze malware and detect malicious capabilities.
    FireEye release SilkETW, a tool for collecting and searching Event Tracing for Windows (ETW) logs.
    CERT-Poland release DRAKVUF, an automated hypervisor-level malware analysis system/sandbox.
    CyberArk release SkyWrapper, a tool that can scan AWS infrastructure and detect if hackers have abused self-replicating tokens to maintain access to compromised systems.
    CyberArk release SkyArk, a tool to detect shadow admin accounts in AWS and Azure environments.
    F-Secure release TamaGo, a Go-based firmware for bare metal ARM System-on-Chip (SoC) components.
    F-Secure release Jandroid, a tool to identify potential logic bug exploit chains on Android.
    F-Secure release C3, an open source tool for building custom command-and-control servers.
    SEC Consult release SEC Xtractor, a tool for hardware exploitation and firmware extraction.
    NCC Group release Sniffle, the world’s first open source sniffer for Bluetooth 5.
    NCC Group release Phantom Tap (PhanTap), a tool for silently intercepting network traffic.
NCC Group release WStalker, a proxy to support the testing of web API calls.
    Google release Tsunami, a vulnerability scanner for large-scale enterprise networks.
    Google release UKIP, a tool to prevent USB keystroke injection attacks on Linux.
    Google release the Sandboxed API, a project for sandboxing C/C++ libraries on Linux.
    Cloudflare release Flan Scan, a network vulnerability scanner.
    Red Canary release Chain Reactor, a tool for adversary simulations on Linux systems.
    SpecterOps release Satellite, a payload and proxy service for red team operations.
    Trustwave release SCShell, a tool for fileless lateral movement that relies on Service Manager.
    Trustwave release CrackQ, a tool for managing hashcat password-cracking jobs in a queuing system.
    France’s ANSSI cyber-security agency release DFIR ORC, an open-source forensics tool dedicated to artifact collection from Windows systems.
    Sophos release Sandboxie, a user-friendly app to let users sandbox (isolate) dangerous apps inside their own limited container.
    The NSA release Ghidra, a complete software reverse-engineering toolkit.
Intel release HBFA, an app to help with firmware security testing.


    How new apps protect the health and privacy of employees

    Employers have to strike the right balance between protecting employees’ health and preserving their privacy as workplaces reopen during the coronavirus pandemic. Some companies are using apps to conduct health screenings, manage access to the office, and monitor social distancing. While this information will help manage the risk of contracting the coronavirus, it also introduces new privacy concerns.
Here are two examples of how new apps are helping companies protect the health and privacy of their employees, as well as recommendations about how to build these services with security in mind.
    Using daily health screenings to limit risk

    The Centers for Disease Control and Prevention (CDC) does not want people to gather in big groups for good reason: That’s the fastest way to spread COVID-19 to many people at the same time. However, it’s hard to conduct a class or run a warehouse without bringing groups of people together.
    Some universities and businesses are even asking students and employees to fill out health questionnaires at the start of the day to reduce this risk. LiveSafe, a company that started seven years ago with a safety focus, has built WorkSafe. This specialized version of the original app can power health screenings and manage other health and safety risks caused by COVID-19. WorkSafe can be used to send daily health questionnaires to screen for symptoms before a person arrives on campus or at the front lobby. 
    “The platform shows who responded to the health check-in and who has not, and responses are time-stamped and recorded,” said Carolyn Parent, CEO and president of LiveSafe. “Once you do come into the office or the campus, you can report things, like there’s no hand sanitizer, or we need more masks, or the elevators are crowded.”

    The app does not store or track location, and users have the option to share information anonymously. The app can also provide alerts triggered by location.
    LiveSafe works with banks and healthcare companies and helps clients in those industries comply with data privacy rules.
    “We host everything in AWS in the cloud,” said Parent, who added that everyone should be patient as institutions refine these protocols for the new normal.
    “Everybody is trying to figure this out, and we’re all going to have to have a lot of agility,” she said.
    The LiveSafe platform allows employees, students, and other community members to report safety issues to security officials. Users can send text, pictures, and video through the app, and the information is routed to the appropriate department, such as public safety or facilities management. Universities and corporations can use the app and dashboard combination to manage these reports and to push out information.
Austin Peay State University has used the LiveSafe app for several years. Students and staff at the small university in Tennessee can send tips via a personal profile or anonymously and call or text about emergencies. The app also houses the university's emergency guide, maps, and directions to buildings. The school added the WorkSafe option to the LiveSafe app, along with links to the school's COVID-19 website.
    Michael J. Kasitz, assistant vice president for public safety at Austin Peay State University, said that only people in his department have access to data collected from the app. 
    “Users are able to share their locations with us, but can turn off that feature at any time,” he said.  “We are unable to turn on location services for any of our users.”
    Using proximity monitoring for contact tracing
    Another way to reduce the risk of COVID-19 transmission at work is to enable a contact tracing system. PwC is helping its clients do this via the company’s office safety service Check-In, which launched in July 2020 with about 50 customers. 
The service monitors ambient signals (the constant background field of 2.4- and 5-gigahertz radio signals) in an office environment to understand how and when employees interact. The technology then decodes those signals and analyzes them to determine when employees' devices come into proximity with one another.
    The goal for this enterprise-level contact tracing system is to make it easier for human resources departments to conduct contact tracing and use the information to reduce the risk of coronavirus cases spreading within a company.
    However, the platform doesn’t notify anyone that they may have been in proximity to someone with a positive test.  “That’s up to the internal policies and procedures of our clients,” said Rob Mesirow, the principal for IoT services at PwC.
    Mesirow’s team has tested the algorithm in several settings, including office buildings and warehouses. The service uses Bluetooth Low Energy (BLE) to power proximity sensors and monitor the movements of people in a space. 
    During beta testing, Mesirow’s team fine-tuned the algorithm that uses the proximity data to determine the risk of transmitting COVID-19 from one person to another.  
    “If I was in an office, and you were two doors down, and I called in to report I had a positive test, we needed to know that there were two doors and three walls between us,” he said. “We were testing for the right proximity matrix.”
    The algorithm that analyzes the proximity data assigns a high, medium, or low proximity score to help understand exposure. As the Centers for Disease Control guidelines continue to evolve, Mesirow’s team will adjust the parameters to reflect the new information.
    “For example, the CDC first said you have a higher probability of contracting the virus within 30 minutes of exposure, but then it changed to 15 minutes,” he said. PwC offices have not reopened, but the service was tested in the company’s Shanghai office.
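PwC has not published the details of its scoring algorithm, so the following Python sketch is purely illustrative: it buckets an encounter into a high, medium, or low proximity score from an assumed Bluetooth signal-strength (RSSI) reading and exposure time. The thresholds are guesses for illustration, not PwC's actual parameters.

# Hypothetical proximity scoring from BLE signal strength and duration.
# Thresholds below are illustrative assumptions only.

def proximity_score(rssi_dbm: int, minutes_together: float) -> str:
    """Bucket an encounter into high/medium/low exposure risk."""
    if rssi_dbm > -60 and minutes_together >= 15:   # strong signal, sustained contact
        return "high"
    if rssi_dbm > -75 and minutes_together >= 15:   # weaker signal, sustained contact
        return "medium"
    return "low"

print(proximity_score(-55, 20))  # "high"
print(proximity_score(-80, 30))  # "low"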
    Mesirow said that most clients are making the app a requirement for employees returning to the workplace and are promoting it as a positive way to protect the health of colleagues and family members.
    “It’s hard to find a negative in contact tracing,” he said.
    Best practices for building secure apps
Despite the pressure to reopen workplaces quickly, developers should take the time to address data security issues early in the process of creating these apps. Tim Mackey, principal security strategist at Synopsys CyRC, said that data requested by an app should serve a specific, current requirement and should not be collected in anticipation of some future use.
    “With a new app, the decisions surrounding the processing and storage of any data collected will need to be made, which means that key questions surrounding the secure processing and storage of the data have yet to be answered,” he said. “Answering these questions at the design or initial implementation phase is the least costly time to apply security practices.”
Mackey added that going slightly slower at this phase can result in a quicker time to market, as development teams don't need to reimplement poor designs, patch applications, or re-secure retained data.
    The mobile application protection company Guardsquare found that most contact-tracing apps don’t employ sufficient hardening techniques. The company analyzed 17 Android mobile contact tracing apps from 17 different countries. All of these apps were built by government entities, sometimes by hired contractors.
    Only one app that the researchers analyzed was fully obfuscated and encrypted. The other findings were:
    41% have root detection
    41% include some level of name obfuscation
    29% include string encryption
    18% include emulator detection
    6% include asset/resource encryption
    6% include class encryption
    Guardsquare recommends that mobile app developers use a layered approach to security, including code hardening to protect code at rest and runtime application self-protection to protect apps in use. The best practices for app security include root detection, emulator detection, hook detection, tamper detection, and debugger detection.


    How to create a privacy policy that protects your company and your customers

Under privacy law, a privacy policy is a statement or legal document that discloses some or all of the ways a party gathers, uses, discloses, and manages a customer's or client's data. Typically, companies share this customer/client data with their third-party business partners. By informing customers/clients annually, via mailed notices, of company privacy practices concerning the collection and distribution of customer/client data under company management, companies fulfill a legal requirement to protect a customer's or client's privacy. For example, here is Google's privacy policy.

    On an ongoing basis, data stewards within the organization, principally IT, are responsible for keeping corporate data secure and private.
    Collectively, information privacy policies are important to IT, compliance officers, and others in the business because if customers/clients inform the company that they do not want their personal information collected or shared, companies must abide by these decisions; data on these individuals can’t be sold or distributed to others.
    In organizations where customer/client data is extremely sensitive, such as in insurance, financial services, and healthcare, workers must practice privacy protections so that information is not inadvertently shared.
    How to use these policy guidelines
    How you develop and maintain your privacy policy will vary depending upon your business, your customers, and the industry vertical you are in. The guidelines below are broken into general categories you should take into account in your due diligence as you build your privacy policy. Depending on your business application, the key points within each topic will have different degrees of importance for you. Focus on those guidelines that are directly relevant to your business model as you formulate a policy that meets your company’s circumstances, but be sure to review the other topics so you don’t overlook another relevant area.
    Who should be involved

    A privacy policy is an internal matter that concerns employee conduct with sensitive information, but it also has significant impact and ramifications for your outside stakeholders, whether they are your board of directors and investors, your third-party business partners, or your customers. Therefore, to thoroughly cover all areas of privacy, an interdisciplinary team should work together in policy development. This team should include:
IT, the data steward of corporate information
Compliance, the administrative arm of the company that ensures the company is current and compliant with privacy regulatory guidelines
Legal staff, which is current on legislated law and recent privacy case law and should always provide input into and perform due diligence on privacy drafts or revisions before they are enacted
Third-party business partners, who might want to use your customer information for marketing or research but must understand the limits of the information you can give them
Adjunct staff business functions/contractors who need to access sensitive information because it directly affects their ability to do their jobs (e.g., a 'guest' surgeon requires access to a patient's medical history in preparing for a delicate operation)
    Items to cover in a privacy policy
    Privacy is an issue that overlaps the legal/compliance, marketing/public relations, and IT functions to a degree where many elements must be addressed by cross-disciplinary teams. The elements that a privacy policy should address include:
    Commitments to customers/stakeholders
    How customer information is collected and used
    How customer information is shared
    How customer account activity is tracked
    How customer information is provided to third parties
    Data protection and security
    Opt-in or opt-out choices that customers can make with respect to their information
    Customer privacy rights
    Company contact information for customers with questions about privacy
Log information
    Privacy compliance
    Employee privacy practices
    Data retention
    These elements can be grouped into two general categories:
    Communications and marketing
    Legal, compliance, and IT.
    Communications and marketing
    Commitments to customers/stakeholders
    The privacy policy statement that companies issue to their customers should begin with a position statement from the company on how it is going to protect customer information. Many companies use this beginning point of the policy to explain to customers that their data will be encrypted and kept safe and secure — and that the data will not be sold to others. The company also states that the privacy policy and access to it will always be available to customers and that any time there is a policy change, customers will be notified.
    How customer information is collected and used
    The privacy policy issued to customers should explain how the company plans to use customer information (e.g., to improve products) and what customer information the company plans to collect for this purpose (customer account information, browsing history, etc.). If the company plans to use/collect the location information or personal information that resides on users’ local devices, this should also be disclosed.
    How customer information is shared
    The company privacy policy should tell customers of any organizations the company plans to share its customer information with. Typically, these are affiliates or third-party business partners of the company that it feels would be a value-add for the customers.
    Opt-in or opt-out choices that customers can make with respect to their information
    The privacy policy should explain to customers what their opt-in or opt-out choices are for maintaining privacy of their information. For example, companies might give customers an opportunity to opt in (or out) for offers from advertisers or third-party business partners or to decline anonymizing their customer data for purposes of analytics reporting.
    Customer privacy rights
    Customers should be informed of their privacy rights under law. For example, they might have a right to request information concerning whether the company has disclosed personal information to any third parties, and to which third parties, for marketing purposes or whether the company has sold any of their personal information without their consent.
    Company contact information for customers with questions about privacy
    The company should always furnish customers with an email address, a telephone number, and a physical address so that customers can contact it with any questions or feedback about the privacy policy.
    Legal, compliance, and IT
    How customer account activity is tracked
Companies often use cookies to track which websites users are coming from and which websites they go to after they've visited the company website. In addition, usage activities can be tracked on the company website itself. How those cookies are used to track user activities should be explained in the privacy policy, along with the fact that users can disable cookie tracking if they choose to. However, before a policy is published to users, legal, compliance, marketing, and IT should define which user activity patterns are to be tracked and how tracking information is to be used.
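As a minimal illustration of honoring that choice in practice, the Python sketch below (using Flask; the cookie names and consent mechanism are hypothetical) sets a tracking cookie only when a visitor has opted in.

from flask import Flask, request, make_response

app = Flask(__name__)

@app.route("/")
def index():
    resp = make_response("Welcome")
    # Only set an analytics cookie if the visitor has opted in;
    # "tracking_consent" and "analytics_id" are hypothetical names.
    if request.cookies.get("tracking_consent") == "yes":
        resp.set_cookie("analytics_id", "abc123", max_age=30 * 24 * 3600)
    return resp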
    How customer information is provided to third parties
    Internally, legal, compliance, and IT should develop policies and standards that govern how customer information will be provided to third parties and what privacy protections will be implemented. In co-marketing efforts where the customer is informed and can opt out of sharing personal information, the company might share direct customer information and contact information with business partners. In other cases, such as data analytics information offered for sale, the company might be required to anonymize individual customer contacts and information so that data can’t be traced back to individuals.
    Data protection and security
    Security measures, secure storage, and protection of data for purposes of privacy should be defined as a policy and as procedures that are activated in IT, which is the custodian of the data. IT practices should adhere to guidance and standards that are issued from both legal and compliance sources.
    Log information
As part of its network management, IT maintains server logs that automatically collect and store details of how users used company online services; their telephone numbers and/or IP addresses, time of contact, duration of contact, etc.; the browser type used and the times and dates of their service requests; and information gathered by cookies on the website. From a privacy standpoint, IT, legal, and compliance should define how this information is to be used internally, how it is to be protected to guarantee the privacy and security of individuals using the company website, and under which circumstances it will be permissible to share this information.
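One common technical control in this area is to strip or truncate identifiers before log entries are stored or shared. The Python sketch below is a simple, hypothetical example that blanks the last octet of IPv4 addresses; a real deployment would also need to handle IPv6 and other identifiers.

import ipaddress

def anonymize_ipv4(addr: str) -> str:
    """Truncate the last octet of an IPv4 address before logging."""
    ip = ipaddress.ip_address(addr)  # raises ValueError on invalid input
    if ip.version == 4:
        octets = addr.split(".")
        octets[-1] = "0"
        return ".".join(octets)
    return addr  # IPv6 handling omitted in this sketch

print(anonymize_ipv4("203.0.113.57"))  # prints 203.0.113.0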
    Employee privacy practices
    For companies in highly sensitive customer information industries (healthcare, finance, insurance, etc.), employees may often be required to interact with customers online, by telephone, or in person. During these times, sensitive information can be shared. Guided by the recommendations of its legal and compliance departments, the company should have a set of written policies that govern how employees are to treat customers and their private information, accompanied by training of all employees who are in customer-facing functions and/or come in contact with sensitive information. Similar privacy policies and procedures should be enacted for IT personnel who are tasked with managing and accessing private customer information. As part of this process, IT should maintain extensive logs that track employee, IT, and business partner access to customer information.
    Privacy compliance
    Companies should develop policies and procedures that minimally assure annual audits of information security and privacy of customer and other information critical to the enterprise, with audit cycles addressing and documenting any changes to existing information privacy practices.
    Data retention
    IT, together with business user areas, compliance, and legal, should annually review data retention policies, making and documenting revisions as needed. Data retention specifically addresses how long sensitive customer history will be maintained in corporate data stores.
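A retention policy ultimately has to be enforced in code or tooling. The sketch below is a hypothetical Python example (the SQLite database, table name, column layout, and two-year window are all assumptions for illustration) that purges customer-history records older than the retention window.

import sqlite3
from datetime import datetime, timedelta

RETENTION_DAYS = 365 * 2  # assumed two-year retention window

conn = sqlite3.connect("customer_history.db")  # hypothetical database
conn.execute("CREATE TABLE IF NOT EXISTS customer_history "
             "(id INTEGER PRIMARY KEY, detail TEXT, created_at TEXT)")

# Delete rows whose ISO-formatted timestamp is older than the cutoff.
cutoff = (datetime.utcnow() - timedelta(days=RETENTION_DAYS)).isoformat()
deleted = conn.execute(
    "DELETE FROM customer_history WHERE created_at < ?", (cutoff,)).rowcount
conn.commit()
print(f"Purged {deleted} records older than {RETENTION_DAYS} days")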
    Policy development and execution
    Audit cycles and regulatory compliance
    Companies should check with their legal counsel, regulators, and auditors to determine what needs to be audited in areas of information privacy. In some cases, companies might also have internal audit procedures that their own audit and compliance teams perform. As part of the audit and compliance process, companies should take steps to ensure that their privacy policies are kept up to date with the latest regulatory and compliance rules and that policy updates are issued on a timely basis to customers, business partners, and other stakeholders.
    Policy updates and approvals
Privacy policy updates should be issued immediately upon approval. The approval signature list for these updates should be agreed to within the company and should be fully executed before any policy update takes effect. Each update should be accompanied by education/training for employees affected by the policy. For each policy, a historical record of all updates should be maintained.
    Policy sign offs by employees
    As part of the new employee orientation process, employees being placed into positions that involve privacy issues should be required to receive training, read policies, and sign off that they have read all policies concerning privacy before they begin their assignments. A record of all employee sign offs should be maintained.
    Violations and penalties
    Violations of privacy policies can result in serious consequences for employees and for the company. For this reason, employees should be informed that violation of privacy policies can result in disciplinary action leading up to and including termination of employment and civil and/or criminal prosecution under federal and/or state laws. Employees assuming responsibilities that involve the protection of private information should be required to read and sign off on the corporate statement on violations and penalties before they begin their assignments. The company should maintain a record of these signed employee acknowledgements that the violation/penalties memorandum has been read and understood.