Technology Requires Responsibility


By Samuel Greengard

By now, most of us know that it's far easier to invent technology than to apply morals and ethics intelligently and consistently. For example, business and IT executives now have more information about consumers at their fingertips than at any point in history. The upside is that these businesses can engage customers more effectively, and ads and marketing campaigns are more targeted and relevant.

The downside is a general degradation of privacy and a growing misuse of big data. A couple of years ago, retailing giant Target hit the target a little too squarely. A man stormed into a store in Minneapolis and demanded to know why his teenage daughter was receiving promotions for maternity clothing, cribs and similar items.

At first, he thought it was merely misdirected advertising. Then he went home and found out she was pregnant. Oops.

As Target discovered, there's a fine line between acceptable use of customer information and an invasion of privacy. Its big data quest worked a little too well, and it lacked the human thinking and oversight necessary to keep things from sliding into the creepy category.

However, many companies continue to push the limits of technology and back off only when there's bad press or a dinged brand image.

In the future, what happens if a job candidate falls into a high-risk category for lying or cheating? What if a consumer pops up as a poor loan risk or an algorithm says that she is likely to develop cancer by age 40?

Will companies unilaterally withhold a job offer, deny a loan or refuse medical insurance, even when there's no certainty about the prediction?

Worse, what happens if the data is flawed?