How could algorithms put individuals and communities in harm’s way?



As distant and aloof as mathematical equations may seem, they are commonly associated with reliable, hard science. Every now and then, however, a sequence of numbers and symbols turns out to conceal a more ominous potential. What causes applications that otherwise serve a good cause to go bad? There could be any number of reasons. One of the first that springs to mind has to do with human nature. People follow a familiar mechanism: they let stereotypes and prejudices guide their lives, applying them to other individuals, social groups, and…
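To make that mechanism concrete, here is a minimal, hypothetical sketch in Python. All data, group names, and the scoring rule are invented for illustration and are not taken from the article; the point is only to show how a model trained on prejudiced historical decisions quietly reproduces that prejudice.

```python
# Hypothetical sketch: an "objective" scoring rule learned from biased
# historical decisions simply reproduces the bias it was trained on.
from collections import defaultdict

# Invented past loan decisions, each: (applicant_group, income, approved)
history = [
    ("group_a", 52, True), ("group_a", 40, True), ("group_a", 35, True),
    ("group_a", 30, False),
    ("group_b", 55, False), ("group_b", 48, False), ("group_b", 36, True),
    ("group_b", 33, False),
]

# "Training": estimate the historical approval rate per group.
counts = defaultdict(lambda: [0, 0])  # group -> [approvals, total]
for group, _income, approved in history:
    counts[group][0] += int(approved)
    counts[group][1] += 1

approval_rate = {g: a / t for g, (a, t) in counts.items()}

def score(group: str, income: int) -> float:
    """Naive model: blends income with the group's historical approval rate.

    The group term looks like a harmless statistical feature, but it encodes
    whatever prejudice shaped the historical decisions."""
    return 0.5 * (income / 60) + 0.5 * approval_rate[group]

# Two applicants identical in every respect except group membership:
for g in ("group_a", "group_b"):
    print(g, round(score(g, 45), 3))
# group_a scores higher than group_b on identical income: the model has
# learned the historical prejudice and will perpetuate it.
```

The toy model itself is beside the point; the pattern is what matters: the bias enters through the data, not through any malicious line of code.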

Norbert Biedrzycki is Head of Services CEE at Microsoft. He leads Microsoft services in 36 countries, covering business and technology consulting, in particular in areas such as big data and AI, business applications, cybersecurity, and premium and cloud services. Previously, as Vice President of Digital McKinsey responsible for CEE, he provided a holistic combination of strategic consulting and digital transformation through rapid deployment of business applications, big data solutions and advanced analytics, business use of artificial intelligence, blockchain, and IoT. Prior to that, Norbert was President of the Management Board and CEO of Atos Polska, and was also CEO of ABC Data S.A. and President of the Management Board and CEO of Sygnity S.A. He had previously worked for McKinsey as a partner and, at the beginning of his career, headed Oracle's consulting and business development services. Norbert's passion is technology: he is interested in robotization, automation, Artificial Intelligence, blockchain, VR, AR, and IoT, and in the impact modern technologies have on our economy and society. You can read more on this on his blog.
