
In the recent edition of “Oh Behave!,” the annual cybersecurity survey conducted by the National Cybersecurity Alliance, an alarming 44 percent of respondents said they had experienced cybercrime leading to the loss of data or money. This next statistic may explain why: Only 41 percent of respondents said they use multifactor authentication (MFA) regularly. Learn more about the history of MFA in this edition of Tech Time Warp.
It’s not that MFA is new. Attempts to promote use of two-factor authentication (2FA) and MFA have been around for decades. President Barack Obama even wrote a 2016 op-ed in The Wall Street Journal promoting 2FA in order to kick off a national awareness campaign. The goal, Obama wrote, was to “encourage more Americans to move beyond passwords—adding an extra layer of security like a fingerprint or codes sent to your cellphone.”
MFA’s roots
Today’s (admittedly sometimes frustrating) MFA requirements are rooted in the work of Fernando Corbató, a professor working on time-sharing computing projects at the Massachusetts Institute of Technology in the early 1960s. Corbató invented a password system that allowed multiple users to share a computer’s processing power without having access to their colleagues’ files. (This worked until a graduate student named Allan Scherr printed a list of these user passwords in order to bypass time limits on computer use.)
Thus the need for 2FA and even MFA started to become apparent. In 1983, RSA obtained a patent for an encryption algorithm that required both a public and a private key. In 1998, AT&T received a patent for an “automated method for alerting a customer that a transaction is being initiated and for authorizing the transaction based on a confirmation/approval by the customer thereto.” That’s a mouthful, but it’s 2FA.
Did you enjoy this installment of SmarterMSP’s Tech Time Warp? Check out others here.
Photo: newafrica / Shutterstock
This post originally appeared on Smarter MSP.