
About


Take On Payments, a blog sponsored by the Retail Payments Risk Forum of the Federal Reserve Bank of Atlanta, is intended to foster dialogue on emerging risks in retail payment systems and enhance collaborative efforts to improve risk detection and mitigation. We encourage your active participation in Take on Payments and look forward to collaborating with you.

Comment Standards:
Comments are moderated and will not appear until the moderator has approved them.

Please submit appropriate comments. Inappropriate comments include content that is abusive, harassing, or threatening; obscene, vulgar, or profane; an attack of a personal nature; or overtly political.

In addition, off-topic remarks and spam are not permitted.

April 10, 2017

Catch Me If You Can

I recently became intrigued with a reality network television show that pitted teams of two everyday people (the "fugitives") against a diverse and highly experienced team of former law enforcement, military, and intelligence investigators (the "hunters"). The goal of the contest was for the fugitive team, given a one-hour head start, to elude capture for 28 days so they could collect a prize of $250,000 in the end. The fugitives were given a pot of $500, available only from an ATM, that they could use over the 28 days. But they had a $100 daily limit—and the knowledge that the hunters would be notified of the ATM location immediately. My interest was increased by the location: the fugitives' geographic boundaries were in the Southeast, with Atlanta as the hub, so there were frequent shots of local places that I recognized and had visited.

Underneath the entertainment value was a demonstration of the classic conflict between personal privacy and big-data analytics. This issue has become increasingly complicated as data collection, storage, and analytics have advanced and become less expensive, faster, and more sophisticated. At the same time, people are participating more in electronic communications, transactions, and other activities that create electronic footprints that can be tracked and analyzed. The show demonstrated these collection capabilities numerous times as the investigators pored over bank account transactions, phone records, social media, property and vehicle databases, and other information to identify clues as to a team's location or the people who might be assisting them.

Two of the nine fugitive teams were successful. In subsequent interviews, both teams cited a key factor they believed was critical to their success. They minimized or eliminated their use of cell phones, email, and social media—going off the grid—to avoid giving hints about their location. Knowing that their location would be signaled whenever they used an ATM to get money, they would have already made arrangements to leave the area immediately, before the hunters closed in. Several of the unsuccessful contestants remarked how amazed they were to discover the wide range of information the investigators were able to access about them, their family, and their friends. Some didn't know their location could be tracked through a cell phone or a photograph posted on social media.

Of course, these contestants, as well as any families and friends who might help them, had to sign numerous waivers to allow the investigators to access and collect much of this information. But how much information would be available without such a waiver or court order? In 2015, the European Union adopted an information privacy directive that is generally viewed as highly protective of an individual's privacy. In the United States, similar legislation has been discussed in recent years without much headway, mostly because of differences between European and American attitudes toward data collection as well as concerns about First Amendment infringement.

Does there need to be increased transparency from companies that collect data for marketing purposes? Would clearer disclosures make consumers less likely to participate in rewards programs and other activities that involve data collection, and more inclined to closely guard their personal information? As always, we welcome your feedback.

By David Lott, a payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed

September 26, 2016

AdmiNISTering Passwords: New Conventional Wisdom

I have lived long enough to go through several cycles of "bad" foods that are now deemed not to be so bad after all. In the 1980s, we were warned that eggs and butter were bad for your heart due to their level of cholesterol. Now, decades of nutritional studies have led to a change in dietary guidelines that take into account that eggs provide an excellent source of protein, healthy fats, and a number of vitamins and minerals. Similar reversals have been issued for potatoes, many dairy products, peanut butter, and raw nuts.

Much to my surprise, much of the old conventional wisdom about passwords has been turned on its head by proposed digital authentication guidelines from the United States National Institute of Standards and Technology (NIST) and an article from the Federal Trade Commission's (FTC) Chief Technologist Lorrie Cranor regarding mandatory password changes. Some of NIST's recommendations include the following:

  • User-selected passwords should be a minimum of 8 characters and a maximum of 64 characters. Clearly, size does matter: generally, the longer the password, the more difficult it is to compromise.
  • A password should be allowed to contain all printable ASCII characters including spaces as well as emojis.
  • Passwords should no longer require the user to follow specified character composition rules such as a combination of upper/lower case, numbers, and special characters.
  • Passwords should be screened against a list of prohibited passwords—such as "password"—to reduce the choice of easily compromised selections.
  • Systems should no longer support password hints, as they often serve as a backdoor to guessing the password.
  • Systems should no longer use a knowledge-based authentication methodology—for example, the city where you were born—as data breaches and publicly obtainable information have made this form of authentication weak.
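Taken together, these recommendations boil down to a surprisingly short validation routine. Here is a minimal sketch in Python of what a NIST-style password check might look like; the blocklist is a tiny stand-in for a real corpus of compromised passwords, and the function name is invented for illustration:

```python
# Toy password check reflecting the draft NIST guidance above: enforce
# only length (8 to 64 characters) and a blocklist screen. No composition
# rules, no hints, no knowledge-based questions.

PROHIBITED = {"password", "12345678", "qwertyui"}  # stand-in blocklist

def is_acceptable(candidate: str) -> bool:
    # Length is the only structural rule.
    if not (8 <= len(candidate) <= 64):
        return False
    # Screen against known easily compromised choices (case-insensitive).
    if candidate.lower() in PROHIBITED:
        return False
    # No composition rules: spaces, emoji, and all printable
    # characters are acceptable.
    return True

print(is_acceptable("correct horse battery staple"))  # True: long, spaces OK
print(is_acceptable("password"))                      # False: on blocklist
print(is_acceptable("Ab1!"))                          # False: too short
```

Note that a long passphrase with spaces sails through, while a short password that satisfies every old-style composition rule does not.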

The FTC's Cranor argues in her post that forcing users to change passwords at a set interval often leads to the user selecting weak passwords, and the longstanding security practice of mandatory password changes needs to be revisited. Her position, which is backed by recent research studies, is consistent with but not as strong as NIST's draft guideline that says that users should not be forced to change passwords unless there has been some type of compromise such as phishing or a data breach. Cranor's post does not represent an official position of the FTC and recommends that an organization perform its own risk-benefit analysis of mandatory password expiration and examine other password security options.

So while I finish my breakfast of eggs, hash browns (smothered and covered, of course), and buttered toast washed down with a large glass of milk, I will continue to ponder these suggestions. I would be interested in your perspective so please feel free to share it with us through your comments.

By David Lott, a payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed

April 18, 2016

"I want to be alone; I just want to be alone"

The famously reclusive screen star Greta Garbo spoke this line forlornly as the Russian ballerina Grusinskaya in the 1932 film Grand Hotel. This movie line causes me to occasionally wonder why we all can't just be left alone. Narrowed to payments, why does paying anonymously have to indicate you are hiding something nefarious?

Some of you may be asking why it would be necessary to hide anything. I offer the following examples of cases when someone would want to pay anonymously, either electronically or with cash.

  • Make an anonymous contribution to a charitable or political organization to avoid being hounded later for further contributions.
  • Make a large anonymous charitable contribution to avoid attention or the appearance of self-aggrandizement.
  • Recompense someone in need who may or may not be known personally with no expectation or wish to be repaid.
  • Pay anonymously at a merchant to avoid being tracked for unwelcome solicitations and offers.
  • Make a purchase for a legal but socially-frowned-upon good or service.
  • Shield payments from scrutiny for medical procedures or pharmacy purchases that are stigmatized.
  • Personally, use an anonymous form of payment to avoid letting my wife find out what she will be getting as a gift. (Don't worry; my spouse never reads my blogs so she doesn't know she needs to dig deeper to figure out what she is getting.)

Some of these cases can be handled easily with the anonymity of cash. As cash becomes less frequently used or accepted, or perhaps even unsafe or impractical, what do we have as an alternative form of payment? Money orders such as those offered by the U.S. Postal Service are an option, though the Postal Service caps a single money order at $1,000. Nonreloadable prepaid cards such as gift cards offer some opportunity as long as the amount is below a certain threshold. Distributed networks like bitcoin offer some promise but may come with greater oversight and regulation in the future. Some emerging payment providers claim to offer services tailored for anonymous payments. Still, the future for a truly anonymous, ubiquitous payment alternative like cash doesn't look promising, given the current regulatory climate.

I acknowledge that one needs to find a proper balance between vigorously tackling financial fraud, money laundering, and terrorist financing and the need that I think most of us share for regulators and others to keep out of our personal business unless a compelling reason justifies such an intrusion. Consequently, we should be scrupulous about privacy but provide the investigatory tools to identify the activities and the people involved when payments are used for nefarious purposes. In many ways, this balancing act dovetails with the highly charged debate over the value of encryption versus the need of law enforcement and intelligence agencies for investigatory tools to read encrypted data. As Greta Garbo famously said and perhaps inadvertently foretold, some of us just want to be left alone.

By Steven Cordray, payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed

July 13, 2015

Biometrics and Privacy, or Locking Down the Super-Secret Control Room

Consumer privacy has been a topic of concern for many years now, and Take on Payments has contributed its share to the discussions. Rewinding to a post from November 2013, you'll see the focus then was on how robust data collection could affect a consumer's privacy. While biometrics technology—such as fingerprint, voice, and facial recognition for authenticating consumers—is still in a nascent stage, its emergence has begun to take more and more of the spotlight in these consumer privacy conversations. We have all seen the movie and television crime shows that depict one person's fingerprints being planted at the crime scene or severed fingers or lifelike masks being used to fool an access-control system into granting an imposter access to the super-secret control room.

Setting aside the Hollywood dramatics, there certainly are valid privacy concerns about the capture and use of someone's biometric features. The banking industry has a responsibility to educate consumers about how the technology works and how it will be used to provide an enhanced security environment for their financial transaction activities. Consumers who understand how their personal information will be protected will be more likely to accept the technology.

As I outlined in a recent working paper, "Improving Customer Authentication," a financial institution should provide the following information about the biometric technology it is looking to employ for its various applications:

  • Template versus image. A template-based system processes the collected biometric data elements through a complex mathematical algorithm to create a numerical representation called a template. The use of a template-based system provides greater privacy than a process that captures an image of the biometric feature and compares it to the original image captured at enrollment. Image-based systems create the potential that the biometric elements could be reproduced and used in an unauthorized manner.
  • Open versus closed. In a closed system, the biometric template will not be used for any other purpose than what is stated and will not be shared with any other party without the consumer's prior permission. An open system is one that allows the template to be shared among other groups (including law enforcement) and provides less privacy.
  • User versus institutional ownership. Currently, systems that give the user control and ownership of the biometric data are rare. Without user ownership, it is important to have a complete disclosure and agreement as to how the data can be used and whether the user can request that the template and other information be removed.
  • Retention. Will a user's biometric data be retained indefinitely, or will it be deleted after a certain amount of time or upon a certain event, such as when the user closes the account? Providing this information may soften a consumer's concerns about the data being kept by the financial institution long after the consumer sees no purpose for it.
  • Device versus central database storage. Storing biometric data securely on a device such as a mobile phone provides greater privacy than a cloud-based storage system. Of course, the user should employ strong security, including setting strong passwords and making sure the phone locks after a period of inactivity.
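To make the template idea in the first bullet concrete, here is a toy sketch in Python. It is not a real biometric algorithm: the feature values, function names, and matching threshold are all invented for illustration. The point is that only a derived numeric template is stored and compared, never the raw image:

```python
import math

# Toy illustration of template-based matching: the system stores a
# fixed-length numeric template derived from measurements, and matching
# compares templates by distance rather than overlaying raw images.

def make_template(features):
    """Reduce raw measurements to a fixed-length numeric template."""
    return tuple(round(f, 2) for f in features)

def matches(enrolled, sample, tolerance=0.5):
    """Accept the sample if its template is within a distance threshold."""
    dist = math.sqrt(sum((a - b) ** 2 for a, b in zip(enrolled, sample)))
    return dist <= tolerance

enrolled = make_template([3.14, 2.71, 1.41])  # captured at enrollment
genuine = make_template([3.10, 2.75, 1.44])   # same person, slight noise
impostor = make_template([5.00, 1.00, 2.00])  # different person

print(matches(enrolled, genuine))   # True
print(matches(enrolled, impostor))  # False
```

Because only the template leaves the sensor, a stolen template reveals far less than a stolen image, which is the privacy advantage the bullet describes.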

I believe that the more consumers understand the whys and hows of biometric authentication technology, the greater their willingness to adopt it will be. Do you agree?

By David Lott, a payments risk expert in the Retail Payments Risk Forum at the Atlanta Fed