The first I heard of differential privacy was when Apple started using the term to describe how they apply advanced data techniques to make smart recommendations without compromising user privacy. This paper is filled with more math than I can follow, but it explains a bit more about what differential privacy really is.

Differential privacy formalizes the idea that a query should not reveal whether any one person is present in a dataset, much less what their data are. Imagine two otherwise identical datasets, one with your information in it and one without. Differential privacy ensures that the probability of a query producing any given result is nearly the same whether it's run on the first dataset or the second.
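To make that a little more concrete, here's a minimal Python sketch of the classic Laplace mechanism, one common way of achieving that guarantee for a simple count query. This is my own toy illustration, not anything from the paper, and the dataset, function names, and epsilon value are all made up for the example.

```python
import numpy as np

def noisy_count(dataset, epsilon=0.5):
    """Answer a count query with Laplace noise calibrated to epsilon.

    Adding or removing one person changes the true count by at most 1
    (the query's sensitivity), so noise drawn with scale 1/epsilon is
    enough to mask whether any one individual is in the dataset.
    """
    true_count = len(dataset)
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Two neighboring datasets: identical except that "you" appear in one.
without_you = ["alice", "bob", "carol"]
with_you = without_you + ["you"]

# The two answers come from heavily overlapping distributions, so seeing
# one noisy result tells an observer very little about which dataset it
# came from -- which is exactly the differential privacy guarantee.
print(noisy_count(without_you))
print(noisy_count(with_you))
```

Smaller epsilon means more noise and stronger privacy, at the cost of less accurate answers; real systems spend that privacy budget carefully across many queries.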

If you're working on systems that collect user data and also want to protect privacy, this would be a good read.