While dining out with clients in New York in 1949, Frank McNamara was embarrassed to find that he’d left his wallet at home. His wife bailed him out and paid for the meal, but the experience gave McNamara an idea: What if there were a way to purchase the meal on credit and pay for it at the end of the month?

It was the beginning of the invention of the modern credit card.

Within days, McNamara and his partners had developed the idea for the Diners Club card, which became the first contemporary credit card. In 1950, McNamara and his attorney, Ralph Schneider, launched Diners Club International, which was originally accepted by 27 New York restaurants and used by about 200 of their friends and acquaintances.

How did a dining card evolve into today’s credit cards?

Since the 1920s, some department stores had offered metal charge plates that allowed customers to charge purchases, but they were only good at the issuing retailer. The Diners Club card was the first to allow purchases at various restaurants—and its rapid success showed that consumers were hungry for such credit options. By the end of 1950, Diners Club had 20,000 members, and by the end of 1951, it had 42,000.

The club charged members $5 a year to belong, and restaurants paid 7 percent of the purchases made at their establishments. Members were required to pay their bills in full at the end of each month.
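The club’s two revenue streams, a $5 annual membership fee and a 7 percent cut of each charge, amount to simple arithmetic. Here is a minimal sketch using the figures above; the total charge volume in the example is purely hypothetical.

```python
# Illustrative arithmetic for the early Diners Club revenue model:
# $5 per member per year plus 7 percent of every charge, per the
# figures in the article. The charge volume below is hypothetical.
ANNUAL_FEE = 5          # dollars per member per year
MERCHANT_RATE = 7       # percent of each purchase

def club_revenue(members, total_charges):
    """Annual revenue from membership fees plus merchant fees."""
    return members * ANNUAL_FEE + total_charges * MERCHANT_RATE / 100

# 20,000 members (the end-of-1950 figure) and a hypothetical
# $1,000,000 in total charges:
print(club_revenue(20_000, 1_000_000))  # 170000.0
```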

Other companies began developing similar products. In 1958, several companies, including American Express and Bank of America, introduced credit cards. The Bank of America card was unique: While Diners Club and its earlier competitors were accepted only at restaurants and travel and entertainment outlets, the Bank of America card was accepted by a wide variety of merchants. It also allowed some customers to revolve balances and pay interest on them, which was new.

The BankAmericard, originally used only in California, eventually started licensing cards to banks in other states. That meant banks across the country could issue cards that would be accepted by all the merchants who accepted BankAmericard.

To compete with Bank of America, a group of other California banks launched the Interbank Card Association (ICA) in 1966. Eventually, BankAmericard became Visa and ICA became MasterCharge (and later MasterCard). These two international networks still act as middlemen between card issuers and merchants.

And the original credit card is still around, although it’s more common internationally than in the United States. In 1981, Citicorp acquired Diners Club International, and it was acquired again in 2009 by Discover Financial Services. Also in 2009, BMO Financial Group acquired exclusive rights to issue Diners Club cards to corporate and personal clients in the United States and Canada. (Diners Club partnered with MasterCard in 2004, and its cards are now accepted at over 38 million MasterCard merchant locations worldwide.)

How did rewards get started?

In the early days of credit cards, you could only use your card at the merchants that had an agreement with your card issuer. Some restaurants, airlines and stores accepted Visa or its precursor, BankAmericard, while others accepted MasterCard (or MasterCharge).

At the time, it made sense to choose your card based on the merchants who accepted it. Most users selected the card that was accepted at the stores or restaurants they used the most.

But over time, as credit cards became more widely accepted and many merchants started taking multiple types of cards, card companies needed a new competitive advantage. To sway consumers to use their cards, they began offering rewards.

In 1986, Discover Financial Services launched its “cash back” program, which gave cardholders cash back at the end of each year in an amount based on the total charges placed on the card during the year. In 1990, the AT&T Universal Card was introduced, offering cash back on each purchase that could be applied to the cardholder’s phone bill.
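The year-end rebate described above is straightforward arithmetic: a fixed percentage of the year’s total charges. The sketch below works in cents to keep the math exact, and the 1 percent rate is purely illustrative, not Discover’s actual rate.

```python
# Hypothetical year-end cash-back calculation in the spirit of the
# 1986 Discover program: a flat percentage of total annual charges.
# The 1% rate is illustrative; amounts are in cents so the
# arithmetic stays exact.
def year_end_cash_back(charge_cents, rate_percent=1):
    """Return the annual rebate, in cents, on a list of charges."""
    return sum(charge_cents) * rate_percent // 100

# Charges of $1,200.00, $350.00 and $75.50 over the year:
print(year_end_cash_back([120_000, 35_000, 7_550]))  # 1625 (i.e., $16.25)
```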

Other credit card companies took notice, and loyalty rewards have become one of the driving forces behind the growth of the credit card industry. Airline miles, hotel points, cash back and other rewards remain popular among credit cardholders today.

How has technology changed credit cards?

The first Diners Club cards were made of cardboard and included the cardmember’s name and account number. In 1961, Diners Club introduced plastic cards, and by the mid-1960s, Diners Club had 1.3 million cardholders.

With plastic cards, the cardholder’s name and account number could be raised, so that merchants could quickly make an imprint of the card and record the charges. This method was widely used until 1980, when the magnetic stripe took its place.

Beginning around 1980, most credit cards had a magnetic stripe across the back, which contained information including the cardholder’s name and account number. Merchants could swipe the card through a reader, and the information was sent directly to the cardholder’s bank. Within seconds, the card issuer could confirm whether the cardholder had sufficient credit to cover the purchase, and either approve or decline the request.
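That swipe-and-approve loop can be sketched as a tiny simulation: the reader sends the account details and purchase amount to the issuer, which approves the charge only if enough credit remains. Everything here (the Account class, the authorize function, the dollar figures) is illustrative, not any real network’s protocol.

```python
# A minimal sketch of the authorization flow described above. The
# issuer approves a charge only when it fits within the cardholder's
# remaining credit; all names and figures are illustrative.
from dataclasses import dataclass

@dataclass
class Account:
    holder: str
    credit_limit: float
    balance: float = 0.0

    def available_credit(self):
        return self.credit_limit - self.balance

def authorize(account, amount):
    """Approve the purchase if it fits within available credit."""
    if amount <= account.available_credit():
        account.balance += amount
        return "APPROVED"
    return "DECLINED"

acct = Account(holder="F. McNamara", credit_limit=500.00)
print(authorize(acct, 120.00))  # APPROVED
print(authorize(acct, 450.00))  # DECLINED -- only $380 remains
```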

While the magnetic stripe helped prevent credit card fraud by shortening the time between purchase and approval from the card issuer, it wasn’t perfect. In time, fraudsters learned to lift the information from the magnetic stripe, which made it easy for them to steal from cardholders.

In recent years, magnetic stripe cards have given way to chip-and-PIN cards. The chips embedded in today’s credit cards encrypt the cardholder’s personal and account information, making it far more difficult for thieves to access.

What does the credit card of the future look like?

Today, the credit card industry is more about credit than it is about cards. While chip-and-PIN technology makes credit cards safer, a growing number of consumers purchase goods and services with their mobile devices using tools like Apple Pay and Google Pay.

Smartphones, smart watches and other wearables now integrate wireless payment technologies that may eventually replace the plastic in our wallets. And one day, a scan of a cardholder’s fingerprint or retina may become the most common way to charge purchases.

When McNamara created the first cardboard credit cards almost three-quarters of a century ago, he likely never imagined that he was paving the way for a multi-trillion-dollar industry. His vision of a more convenient way to pay sparked an ongoing quest for new and better payment technologies and revolutionized the way Americans use and think about money.

This article contains the current opinions of the author, but not necessarily those of Acorns. Such opinions are subject to change without notice. This article has been distributed for educational purposes only and should not be considered as investment advice or a recommendation of any particular security, strategy or investment product. Information contained herein has been obtained from sources believed to be reliable, but not guaranteed.