Correlating Time-Domain Noise to Detect Fraudulent Credit Cards

Will Liu and Andrew Wiens

ESE 498 Senior Design Project

Spring 2013


Today’s market is dominated by credit card transactions, and the many ways to counterfeit credit cards create serious problems for credit card security. A system to detect fraudulent credit cards is needed, and it must be robust, consistent, and reliable while also taking cost into consideration. We propose a credit card security system that creates a unique fingerprint of each individual card, called a Magneprint™. Fraudulent counterfeits can then be identified by comparing their Magneprints™ to that of the authentic card. To be robust, consistent, cost-efficient, and reliable, the system must have minimal rates of false positives and false negatives as well as small Magneprint™ sizes.

 

Project Specifications

The Magneprint™ sizes allowed for performance testing are 384, 768, 1024, and 2048 bits. The rates of false accepts and false rejects must be minimized; for our scope and initial stages of testing, each must be less than 0.5%.

 

Literature Review

The Magneprint™ patents provide much information about the nature of the noise found on magnetic cards, as well as implementations that generate a fingerprint from the noise. Figure 1 appears in U.S. Patent No. 7,478,751 and shows an example of magnetic noise on a credit card as captured by an analog-to-digital converter while the card is swiped through a card reader. High-frequency noise can be found between the data peaks.

Figure 1 Noise region in the data track


In order to isolate the high-frequency noise and create a fingerprint of the credit card, the low-frequency components of the waveform must be removed. The same patent describes a method to remove low frequencies from the waveform using a mean smoother (moving average filter), which subtracts the average of the n neighboring samples from each sample. This filter is useful for creating Magneprints™ because it has a steep response in the frequency domain. Also, because the filter has a finite impulse response, energy from data peaks in the waveform does not leak into the noise region. A block diagram of the filter and an example of a filtered waveform are shown below as they appear in the patent: Figure 2 shows the block diagram for the mean smoother, and Figure 3 shows an example waveform produced by it.

Figure 2 Mean smoothing filter

Figure 3 Output from mean smoother
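Although our implementation is in LabView, the mean smoother is easy to sketch in a text language. The following Python sketch is a textbook moving-average high-pass, not the patented code; its half_width parameter plays the role of our Filter parameter:

```python
import numpy as np

def mean_smoother(x, half_width):
    """Subtract a moving average to suppress the low-frequency card
    data and keep the high-frequency remanent noise.

    Each sample has the mean of its (2 * half_width + 1)-sample
    neighborhood subtracted. Because the filter is FIR, energy from a
    data peak only influences samples within half_width of it.
    """
    n = 2 * half_width + 1
    local_mean = np.convolve(x, np.ones(n) / n, mode="same")
    return x - local_mean
```

A constant or slowly varying input is driven to zero away from the edges, while sample-to-sample noise passes through largely unchanged.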

We chose LabView 2012 as our tool to build the Magneprint™ system for its utility and popularity in academia, and on the recommendation of our advisor, Dr. Morley. The system captures a unique “fingerprint” made of noise from each individual card and builds a database. Known authentic and inauthentic cards are correlated, and the results are displayed on histograms to check system performance, the details of which are discussed later in this section.

 

Design

The general mechanism of our system is to first capture a waveform of a card, which contains both data and noise. Since our database will contain only forward-oriented Magneprints™, we flip the waveforms from all backward swipes. We then normalize the waveforms so that they can be properly correlated. Each waveform is passed through several signal-processing stages to create the Magneprint™: extracting only specified “chunks” of the whole waveform, moving average filtering, decimation and subsampling, and quantization. The parameters for the Filter, Decimate (and Subsample), and Quantize VIs are determined with a separate optimization VI explained later in this section. Finally, each card is correlated with all the others and the results are displayed on histograms. The top-level LabView VI is shown in Figure 4 below. It shows our implementation at the highest level: a pipeline architecture operating on a 1D array of waveforms. Part of the credit goes to Dr. Morley for his code for acquiring raw waveforms from DAT files.

 


Figure 4 Our LabView design
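In text form, the pipeline can be sketched as follows. This Python sketch is illustrative only: the flip, normalization, filtering, subsampling, and quantization mirror the stages described above, but the chunk selection and the exact quantizer of our LabView VIs are not reproduced.

```python
import numpy as np

def magneprint(waveform, half_width=17, delta=89, m=96, bits=4,
               backward=False):
    """Illustrative fingerprint pipeline (defaults are the parameters
    reported in the Results section)."""
    x = np.asarray(waveform, dtype=float)
    if backward:
        x = x[::-1]                    # flip backward swipes forward
    x = (x - x.mean()) / x.std()       # normalize for correlation
    n = 2 * half_width + 1             # mean smoother: remove the
    x = x - np.convolve(x, np.ones(n) / n, mode="same")  # low frequencies
    samples = x[::delta][:m]           # m subsamples, delta apart
    levels = 2 ** (bits - 1) - 1       # signed `bits`-bit quantizer
    scale = np.abs(samples).max() or 1.0
    q = np.clip(np.round(samples / scale * levels), -levels, levels)
    return q.astype(int)               # fingerprint of m * bits bits
```

Note that flipping a backward swipe before processing yields the same fingerprint as processing the forward swipe directly, which is what lets the database store forward-oriented Magneprints™ only.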

 

Results

Figure 5 shows the results of running the top-level VI on a database of swipes provided by Dr. Morley. The database contained a total of 264 swipes of 75 different cards, with roughly 3 re-swipes per card and some additional swipes made backwards. The parameters used were Filter = 17, Delta = 89, M (subsamples to take) = 96, and B (number of bits) = 4. As you can see, the reject distribution is a near-perfect Gaussian centered around 0. The histogram is also fairly tight, meaning the standard deviation is small, with nearly all correlations falling between -0.4 and 0.4. The accept distribution also looks excellent, with no correlations below 0.9. The two distributions therefore do not overlap, so there are no false positives (a card deemed authentic when it is not) and no false negatives (a card deemed inauthentic when it is). This further demonstrates the robustness and accuracy of our system.


Figure 5 Accept and reject distributions generated from Dr. Morley’s card database
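The pairwise correlation behind these histograms can be sketched as follows. This is a standard normalized (Pearson-style) correlation; our LabView correlator may differ in detail:

```python
import numpy as np

def normalized_correlation(a, b):
    """Correlation of two fingerprints, in [-1, 1]."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return float(np.dot(a, b) / len(a))

def accept_reject(fingerprints, card_ids):
    """Correlate every pair of swipes. Same-card pairs feed the accept
    distribution; different-card pairs feed the reject distribution."""
    accept, reject = [], []
    for i in range(len(fingerprints)):
        for j in range(i + 1, len(fingerprints)):
            r = normalized_correlation(fingerprints[i], fingerprints[j])
            (accept if card_ids[i] == card_ids[j] else reject).append(r)
    return accept, reject
```

Re-swipes of the same card share the card's remanent noise and correlate near 1, while swipes of different cards have independent noise and correlate near 0, which produces the two histograms in Figure 5.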

Separation is defined as:

S = μ_A − (σ_A + σ_R)

Where:
μ_A = mean of the accept distribution
σ_A = standard deviation of the accept distribution
σ_R = standard deviation of the reject distribution

Separation S is used as a rating of performance for our system. The higher the separation, the smaller the rates of false positives and false negatives, because the accept and reject distributions overlap less. This lets us pick a threshold for deciding whether a card is authentic that does not touch the tails of either distribution.
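With the accept mean and the two standard deviations read off the histograms, the separation can be computed directly. The sketch below assumes our reading of the definition, S = mean(accept) − (stdev(accept) + stdev(reject)):

```python
import statistics

def separation(accept, reject):
    """Separation S of the accept and reject correlation lists,
    assuming S = mean(accept) - (pstdev(accept) + pstdev(reject))."""
    return (statistics.mean(accept)
            - statistics.pstdev(accept)
            - statistics.pstdev(reject))
```

A tighter reject histogram or a higher accept mean both raise S, matching the intuition that less overlap means fewer false accepts and false rejects.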

Figure 6 shows the optimizer VI, which sweeps the parameter values and calculates the separation for each parameter combination. The combination that gives the largest S is chosen and displayed on the LabView front panel for M*B sizes of 384, 768, 1024, and 2048. Delta can take any value that does not overflow the smallest Magneprint™ in the database, while Filter can take the values 17, 22, 27, 32, and 37. M and B can be any factor pair of 384, 768, 1024, or 2048 for each M*B.


Figure 6 Optimizer VI
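The sweep itself is an exhaustive grid search. The sketch below is a hypothetical text-language analogue: the B candidates and delta range shown are assumptions for illustration, and `evaluate` stands in for rebuilding and correlating the fingerprint database with a given parameter set.

```python
import itertools

def best_parameters(evaluate, filters=(17, 22, 27, 32, 37),
                    deltas=range(1, 150), mb_product=384):
    """Exhaustive sweep in the spirit of the optimizer VI: try every
    (Filter, Delta, M, B) combination with M * B equal to the target
    Magneprint size and keep the one with the largest separation.
    The caller supplies evaluate(filter, delta, m, b) -> separation.
    """
    # candidate (M, B) factor pairs; B limited to 1, 2, 4, or 8 here
    mbs = [(mb_product // b, b) for b in (1, 2, 4, 8)
           if mb_product % b == 0]
    best = max(itertools.product(filters, deltas, mbs),
               key=lambda p: evaluate(p[0], p[1], *p[2]))
    f, d, (m, b) = best
    return f, d, m, b
```

Because every combination rebuilds and correlates the whole database, the cost grows with the product of the candidate set sizes, which is why the real sweep takes hours.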

This VI is run using swipes from the database. The parameters achieving the highest separation are shown in Table 1 below.


Table 1 Best M, B, filter half-width, and spacing (delta) parameters

This optimization program is very computationally intensive; it took around 2 hours to complete on a quad-core PC.

 

Conclusions

We were able to determine the parameters giving the best separation for each M*B size, as detailed in Table 1, with the separation being at least 0.642, which is excellent for reducing the rates of false positives and false negatives. Live swiping of cards confirmed our working system, which was able to identify authentic and fraudulent cards based on a correlation threshold of 0.8. Our system is robust and reliable, and it accurately distinguishes authentic cards from copies.

Future areas of exploration include a hardware implementation of the system, optimizing cost versus robustness for commercial use, and, after that, commercial testing. In addition, our implementation could use less computing time and memory by processing cards on the fly.

 

References

  1. R. E. Morley, R. S. DeLand, E. C. Limtao, E. J. Richter, and S. R. Wood, “Method and apparatus for authenticating a magnetic fingerprint signal using a filter capable of isolating a remanent noise related signal component,” U.S. Patent No. 7,478,751, January 20, 2009.
  2. T. C. McGeary and R. S. DeLand, Jr., “Magnetic stripe card verification system,” U.S. Patent No. 6,098,881, August 8, 2000.