

May 20, 2019

Could Federal Privacy Law Happen in 2019?

Some payments people have suggested that this could be the year for mobile payments to take off. My take? Nah. I gave up on that thought several years ago, as I've made clear in some of my previous posts. I'm actually wondering if this will be the year that federal privacy legislation is enacted in the United States. The effects of the European Union's General Data Protection Regulation (GDPR), which took effect a year ago (see this Take on Payments post), are being felt in the United States and around the globe. The GDPR has essentially created a global standard for how companies should protect personal data, along with rights for individuals to understand what data is being collected and to opt out of that collection. While technically the GDPR protects only EU citizens, even when they travel outside the European Union, most businesses have taken a cautious approach and are treating every transaction they process, financial or informational, as something that could be covered under the GDPR.

A tangible impact of the GDPR in the United States is that the state of California has passed a data privacy law known as the California Consumer Privacy Act of 2018 (CCPA) that is partly patterned after the GDPR. The CCPA gives California residents five basic rights related to data privacy:

  • The right to know what personal information a business has collected about them, where it was obtained, how it is being used, and whether it is being disclosed or sold to other parties and, if so, to whom it is being disclosed or sold
  • The right to access that personal information free of charge up to two times within a 12-month period
  • The right to opt out of allowing a business to sell their personal information to third parties
  • The right to have a business delete their personal information, except for information that is required to effect a transaction or comply with other regulatory requirements
  • The right to receive equal service and pricing from a business, even if they have exercised their privacy rights under the CCPA

According to the National Conference of State Legislatures (NCSL), 17 states have mandated that their government websites and access portals post privacy policies and procedures. Other states have laws addressing more specific privacy issues, such as children's online privacy, the monitoring of employee email, and e-reader privacy.

Take On Payments has previously discussed the numerous efforts to introduce federal legislation regarding privacy and data breach notification, efforts that have gained little traction. So why do I think change is in the air? The growing trend of states implementing their own privacy legislation is putting pressure on Congress to act so that businesses operating across state lines have a single, consistent national policy and process to understand and follow.

What do you think?

By David Lott, a payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed

-payments">Retail Payments Risk Forum at the Atlanta Fed

February 11, 2019

AI and Privacy: Achieving Coexistence

In a post early last year, I raised the issue of privacy rights in the use of big data. After attending the AI (artificial intelligence) Summit in New York City in December, I believe it is necessary to expand that call to the wider spectrum of technology that falls under the banner of AI, including machine learning. There is no question that increased computing power, reduced costs, and improved developer skills have made machine learning programs more affordable and powerful. As discussed at the conference, the various facets of AI technology have reached far past financial services and fraud detection into numerous aspects of our lives, including product marketing, health care, and public safety.

In May 2018, the White House announced the creation of the Select Committee on Artificial Intelligence. The main mission of the committee is "to improve the coordination of Federal efforts related to AI to ensure continued U.S. leadership in this field." It will operate under the National Science and Technology Council and will include senior research and development officials from key governmental agencies. The White House's Office of Science and Technology Policy will oversee the committee.

Soon after, Congress established the National Security Commission on Artificial Intelligence in Section 1051 of the John S. McCain National Defense Authorization Act for Fiscal Year 2019. While the commission is independent, it operates within the executive branch. The commission's 15 members, appointed by Congress and the Secretaries of Defense and Commerce, include representatives from Silicon Valley, academia, and NASA. Its aim is to "review advances in artificial intelligence, related machine learning developments, and associated technologies." It is also charged with looking at technologies that keep the United States competitive and with considering the legal and ethical risks.

While the United States wants to retain its leadership position in AI, it cannot overlook AI's privacy and ethical implications. A national privacy advocacy group, EPIC (or the Electronic Privacy Information Center), has been lobbying hard to ensure that both the Select Committee on Artificial Intelligence and the National Security Commission on Artificial Intelligence obtain public input. EPIC has asked these groups to adopt the 12 Universal Guidelines for Artificial Intelligence released in October 2018 at the International Data Protection and Privacy Commissioners Conference in Brussels.

These guidelines, which I will discuss in more detail in a future post, are based on existing regulatory guidelines in the United States and Europe regarding data protection, human rights doctrine, and general ethical principles. They call for accountability and transparency in any AI system that has the potential to affect an individual's rights, and for humans to retain control over such systems.

As the strict privacy and data protection elements of the European Union's General Data Protection Regulation take hold in Europe and spread to other parts of the world, I believe that privacy and ethical considerations will gain a brighter spotlight and that AI will be a major topic of discussion in 2019. What do you think?

By David Lott, a payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed

July 6, 2018

Attack of the Smart Refrigerator

We've all heard about refrigerators that automatically order groceries when they sense the current supply is running low or is out. These smart refrigerators are what people usually point to when giving an example of an "internet-of-things" (IoT) device. Briefly, an IoT device is a physical device that connects to the internet wirelessly and transmits data, sometimes without direct human interaction. I suspect most of you already have at least one of these devices operating in your home or office, whether it's a wireless router, a baby monitor, a voice-activated assistant, or "smart" lights, thermostats, security systems, or TVs.

Experts are forecasting that IoT device manufacturing will be one of the fastest-growing industries over the next decade. Gartner estimates there were more than 8 billion connected IoT devices globally in 2017, with about $2 trillion going toward IoT endpoints and services, and projects that the number of these devices will grow to more than 20 billion by 2020. But what security are manufacturers building into these devices to prevent monitoring or outside manipulation? What prevents someone from hacking into your security system to monitor the patterns of your house or office, or from turning on your interior security cameras and invading your privacy? For devices that can generate financial transactions, what authentication processes will ensure that transactions are legitimate? It's one thing for a refrigerator to order an unneeded gallon of milk; it's another thing entirely for a criminal to use that connection to access a home computer, monitor online banking activity, and capture log-on credentials.
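
One well-established way to approach that authentication question is to give each connected device its own provisioned secret and have it sign every order it sends, so the merchant's server can verify that a request came from the registered device and was not altered in transit. The Python sketch below is a minimal, hypothetical illustration of that pattern; the device ID, secret, and payload are made up for the example and are not taken from any actual product.

```python
import hashlib
import hmac
import json
import time

# Hypothetical per-device secret, provisioned when the refrigerator is registered.
DEVICE_ID = "fridge-0042"
DEVICE_SECRET = b"replace-with-a-provisioned-random-key"

def sign_order(payload: dict) -> dict:
    """Device side: add a device ID and timestamp, then attach an HMAC-SHA256
    signature so the server can detect forged or tampered orders."""
    body = dict(payload, device_id=DEVICE_ID, timestamp=int(time.time()))
    message = json.dumps(body, sort_keys=True).encode()
    body["signature"] = hmac.new(DEVICE_SECRET, message, hashlib.sha256).hexdigest()
    return body

def verify_order(body: dict, secret: bytes, max_age_seconds: int = 300) -> bool:
    """Server side: recompute the HMAC and reject stale or altered orders."""
    received_sig = body.pop("signature", "")
    message = json.dumps(body, sort_keys=True).encode()
    expected_sig = hmac.new(secret, message, hashlib.sha256).hexdigest()
    fresh = (int(time.time()) - body.get("timestamp", 0)) <= max_age_seconds
    return fresh and hmac.compare_digest(received_sig, expected_sig)

order = sign_order({"item": "milk", "quantity": 1})
print(verify_order(order, DEVICE_SECRET))  # True for a recent, untampered order
```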

As one would probably suspect, there is no simple or consistent answer to these security questions, and the overall track record of device security has not been great. There have been major distributed denial of service (DDoS) attacks against websites using botnets composed of millions of IoT devices. Ransomware attacks have been launched against consumers' home security systems and thermostats, forcing consumers to pay the extortionist to get their systems working again.

Some high-end devices, such as driverless cars and medical devices, have been designed with security controls at the forefront, but most other manufacturers have given little thought to a criminal's ability to use a device to access and control other devices running on the same network. Adding to the problem, many of these devices never receive software updates, including security patches.

With cybersecurity issues grabbing so many headlines, people are paying more and more attention to the role and impact of IoT devices. The National Institute of Standards and Technology (NIST) has begun an effort to develop standards for cryptography that can operate within the constraints of IoT devices. However, NIST estimates it will take two to four years to get the standard out.

In the meantime, the Department of Justice has some recommendations for securing IoT devices, including:

  • Research your device to determine security features. Does it have a changeable password? Does the manufacturer deliver security updates?
  • After you purchase a device and before you install it, download security updates and reset any default passwords.
  • If automatic updates are not provided to registered users, check at least monthly to determine if there are updates and download only from reputable sites.
  • Protect your routers and home Wi-Fi networks with firewalls, strong passwords, and security keys.
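
A couple of these checks can be spot-checked from a laptop on the same home network. The following Python script is a minimal, hypothetical sketch, not an official tool: it simply tests whether a device answers on telnet-style management ports, a common entry point for the kinds of IoT botnets mentioned above when devices are still running factory default credentials. The addresses are placeholders for devices on your own network.

```python
import socket

# Ports commonly left open on poorly secured IoT devices:
# 23 and 2323 are telnet variants; 7547 is TR-069 remote management.
SUSPECT_PORTS = [23, 2323, 7547]

def open_suspect_ports(host, ports=SUSPECT_PORTS, timeout=1.0):
    """Return the suspect ports on `host` that accept a TCP connection."""
    found = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.settimeout(timeout)
            if sock.connect_ex((host, port)) == 0:  # 0 means the connection succeeded
                found.append(port)
    return found

if __name__ == "__main__":
    # Hypothetical addresses -- replace with devices on your own home network.
    for device in ["192.168.1.10", "192.168.1.20"]:
        exposed = open_suspect_ports(device)
        if exposed:
            print(f"{device}: ports {exposed} are open; check for default credentials")
        else:
            print(f"{device}: no suspect management ports answered")
```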

I see IoT device security as an issue that will continue to grow in importance. In a future post, I will discuss the privacy issues that IoT devices could create.

By David Lott, a payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed

June 4, 2018

The GDPR's Impact on U.S. Consumers

If your email inbox is like mine, it's recently been flooded with messages from companies you've done online business with, notifying you of changes in their terms and conditions, particularly regarding privacy. What has prompted this wave of notices is the May 25 implementation of Europe's General Data Protection Regulation (GDPR). Approved by the European Parliament in April 2016 after considerable debate, the regulation standardizes data privacy regulations across Europe for the protection of EU citizens.

The regulation applies to both data "controllers" and data "processors." A data controller is the organization that owns the data, while a data processor is an outside company that helps manage or process that data. The GDPR requirements focus on controllers and processors directly conducting business in the 28 countries that make up the European Union (EU). But the GDPR has the potential to affect businesses based in any country, including the United States, that collect or process the personal data of any EU citizen. Penalties for noncompliance can be quite severe. For that reason, many companies are choosing to err on the side of caution and are sending notices of changes to their privacy terms and conditions to all their customers. Some companies have even gone so far as to extend the privacy protections contained in the GDPR to all their customers, EU citizens or not.

The GDPR has a number of major consumer protections:

  • Individuals can request that controllers erase all information collected on them that is not required for transaction processing. They can also ask the controller to stop distributing that data any further and, with some exceptions, to have third parties stop processing it. (This provision is known as "data erasure" or the "right to be forgotten.")
  • Companies must design information technology systems to include privacy protection features. In addition, they must have a robust notification system in place for when breaches occur. After a breach, the data processor must notify the data controller "without undue delay." When the breach threatens "risk for the rights and freedoms of individuals," the data controller must notify the supervisory authority within 72 hours of discovery of the breach. Data controllers must also notify "without undue delay" the individuals whose information has been affected.
  • Individuals can request to be informed whether companies are obtaining their personal data and, if so, how they will use that data. Individuals also have the right to obtain, free of charge, electronic copies of collected data, and they may send that data to another company if they choose.
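
To make the erasure and portability provisions above a little more concrete, here is a minimal, hypothetical sketch of how a data controller might separate the data it must keep to effect transactions (the exception noted above) from the data a customer can have exported or erased on request. The record structure and field names are illustrative assumptions, not anything the GDPR itself prescribes.

```python
import json

# Hypothetical customer record held by a data controller.
CUSTOMER_RECORD = {
    "customer_id": "c-1001",
    "payment_token": "tok_abc123",        # needed to settle open transactions
    "open_orders": ["order-77"],          # needed for transaction processing
    "marketing_profile": {"segment": "frequent-buyer"},
    "browsing_history": ["/shoes", "/sale"],
    "email": "customer@example.com",
}

# Fields the controller retains under the transaction-processing exception.
TRANSACTION_FIELDS = {"customer_id", "payment_token", "open_orders"}

def export_personal_data(record: dict) -> str:
    """Right of access and portability: return the customer's data in a
    machine-readable format they could pass along to another company."""
    return json.dumps(record, indent=2)

def erase_personal_data(record: dict) -> dict:
    """Right to be forgotten: drop everything not needed to complete
    transactions or meet other regulatory requirements."""
    return {key: value for key, value in record.items() if key in TRANSACTION_FIELDS}

print(export_personal_data(CUSTOMER_RECORD))
print(erase_personal_data(CUSTOMER_RECORD))
```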

In addition, the GDPR requires large processing companies, as well as public authorities and other specified businesses, to designate a data protection officer to oversee the companies' compliance with the GDPR.

There have been numerous efforts in the United States to pass uniform privacy legislation, with little or no success. My colleague Doug King authored a post back in May 2015 about three cybersecurity bills under consideration that included privacy rights. Three years later, each of those bills has either had action suspended or remains in committee. It will be interesting to see, as the influence of the GDPR spreads globally, whether there will be any additional efforts to pass similar legislation in the United States. What do you think?

And by the way, fraudsters are always looking for opportunities to install malware on your phones and other devices. We've heard reports of criminals sending "update notice" emails that appear to be legitimate but try to get the unsuspecting recipient to click on a link or open an attachment containing malware or a virus. So be careful!

By David Lott, a payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed