Big & small data - Digital Gender Gap


Big and Small Data - Women and new technologies: countering the gender gap

Eleonora Pantò - Learning, Inclusion and Social Innovation

30 November 2017, Torino CLE


2011: Software is eating the world


Evolution of the desk


Computational thinking

Doctors, lawyers, teachers, farmers, any profession at all: the future of all of these jobs will be full of computational thinking. Sensor-based medicine, smart contracts, data analysis in education, precision agriculture: success will depend on how good you are at computational thinking. I have noticed an interesting trend. Pick any field X, from archaeology to zoology. There are two possibilities: either "computational X" already exists, or it soon will. And everyone considers it the future of that field.

http://blog.stephenwolfram.com/2016/09/how-to-teach-computational-thinking/


Low presence of women in ICT

85% of the technical staff at Facebook and Yahoo are men

• Appcamp4girl

• Girlswhocode

• Women’s code collective

• Rails Girls - Ruby on Rails for girls (Finland)

• Shine for Girls - learning math through dance

Megan Smith has served as CTO of the White House; as a Google vice-president she launched the "Made with Code" campaign to bring girls closer to programming.


…1961: Hidden Figures

Three brilliant African-American women at NASA -- Katherine Johnson (Taraji P. Henson), Dorothy Vaughan (Octavia Spencer) and Mary Jackson (Janelle Monáe) -- serve as the brains behind one of the greatest operations in history: the launch of astronaut John Glenn (Glen Powell) into orbit, a stunning achievement that restored the nation's confidence, turned around the Space Race and galvanized the world.

http://www.npr.org/sections/money/2014/10/21/357629765/when-women-stopped-coding


Women and the future of work

Across all industries, almost half of respondents – 44% – said that both unconscious bias among managers and a lack of work-life balance were significant barriers to gender diversity in the workplace. Almost as many – 39% – pointed to a lack of female role models. Although women now outnumber men at university, and graduate in higher numbers, 36% of respondents still said there weren’t enough qualified women for the positions they’re looking to fill. Only 6% blamed a lack of parental leave, and 10% said there were no barriers.

http://www3.weforum.org/docs/WEF_Future_of_Jobs.pdf

Unconscious Bias


Hired by a computer?

http://www.pewinternet.org/2017/10/04/americans-attitudes-toward-hiring-algorithms/

Survey respondents were asked to read and respond to the following scenario: “Today, when companies are hiring they typically have someone read applicants’ resumes and conduct personal interviews to choose the right person for the job. In the future, computer programs may be able to provide a systematic review of each applicant without the need for human involvement. These programs would give each applicant a score based on the content of their resumes, applications or standardized tests for skills such as problem solving or personality type. Applicants would then be ranked and hired based on those scores.”
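To make the scenario concrete, here is a minimal, hypothetical sketch of such a scoring-and-ranking pipeline; the applicant fields and weights below are invented for illustration and are not taken from any real hiring system.

```python
# Hypothetical sketch of the ranking pipeline described in the Pew scenario:
# each applicant gets a score from resume/test features, then candidates are
# sorted by that score. Field names and weights are invented for illustration.

from dataclasses import dataclass

@dataclass
class Applicant:
    name: str
    years_experience: float
    problem_solving_test: float   # 0-100 standardized test score
    personality_fit: float        # 0-1, from a personality questionnaire

def score(a: Applicant) -> float:
    # Arbitrary illustrative weights; a real system would learn these from
    # past hiring data, which is exactly where bias can creep in.
    return (0.4 * a.years_experience
            + 0.4 * a.problem_solving_test / 10
            + 0.2 * a.personality_fit * 10)

applicants = [
    Applicant("A", 5, 82, 0.7),
    Applicant("B", 2, 95, 0.9),
    Applicant("C", 8, 60, 0.5),
]

# Rank applicants from highest to lowest score, as in the scenario.
for a in sorted(applicants, key=score, reverse=True):
    print(f"{a.name}: {score(a):.1f}")
```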


“Even more remarkable — and even less widely understood — is that in many areas, performance gains due to improvements in algorithms have vastly exceeded even the dramatic performance gains due to increased processor speed.” — Report to the President and Congress: Designing a digital future (2010)

Algorithm

https://www.hsdl.org/?abstract&did=10223


Algorithm

“The Industrial Revolution automated manual work and the Information Revolution did the same for mental work, but machine learning automates automation itself. Without it, programmers become the bottleneck holding up progress. With it, the pace of progress picks up.” — Pedro Domingos, The Master Algorithm

https://www.youtube.com/watch?v=B8J4uefCQMc


AI and bias

AI has the potential to reinforce existing biases because, unlike humans, algorithms are unequipped to consciously counteract learned biases, researchers warn.

https://www.theguardian.com/technology/2017/apr/13/ai-programs-exhibit-racist-and-sexist-biases-research-reveals#img-1


Tay, the sexist chatbot

https://www.theverge.com/2016/3/24/11297050/tay-microsoft-chatbot-racist

" The more you chat with Tay, said Microsoft, the smarter it gets, learning to engage people through "casual and playful conversation."Unfortunately, the conversations didn't stay playful for long. Pretty soon after Taylaunched, people starting tweeting the bot with all sorts of misogynistic, racist, and Donald Trumpist remarks. And Tay — being essentially a robot parrot with an internet connection — started repeating these sentiments back to users, proving correct that old programming adage: flaming garbage pile in, flaming garbage pile out.


Fighting algorithmic bias

https://www.ted.com/talks/joy_buolamwini_how_i_m_fighting_bias_in_algorithms

• Who codes matters

• How we code matters

• And why we code matters

https://www.wired.com/story/machines-taught-by-photos-learn-a-sexist-view-of-women/

Machine-learning software trained on the datasets didn’t just mirror those biases, it amplified them. If a photo set generally associated women with cooking, software trained by studying those photos and their labels created an even stronger association.

https://homes.cs.washington.edu/~my89/publications/bias.pdf
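A toy sketch of the measurement idea behind the paper linked above: compare how often an activity co-occurs with a gender in the training labels versus in the model's predictions. The counts below are invented for illustration only.

```python
# Toy illustration of bias amplification: the model's predictions associate
# "cooking" with women even more strongly than the training labels already do.
# All counts are invented for illustration.

def bias(counts: dict) -> float:
    """Fraction of images for this activity whose agent is labeled 'woman'."""
    return counts["woman"] / (counts["woman"] + counts["man"])

# Hypothetical counts for images labeled with the activity "cooking".
training_labels   = {"woman": 66, "man": 34}   # women in 66% of training images
model_predictions = {"woman": 84, "man": 16}   # women in 84% of predicted labels

b_train = bias(training_labels)
b_pred = bias(model_predictions)
print(f"training bias:  {b_train:.2f}")
print(f"predicted bias: {b_pred:.2f}")
print(f"amplification:  {b_pred - b_train:+.2f}")  # positive = bias amplified
```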


Why algorithms aren’t working for women

http://www.feministcurrent.com/2017/04/07/algorithms-arent-working-women/

But the real way to start changing this technology is to make sure that we all get involved. That means having more conversations about it, learning more about it, and really taking seriously the fact that the technology you use impacts you and the world around you.


Math doesn’t cause bias, and Big Data is only partly to blame. The biggest source of bias in data analysis is and always will be people, both technical and business people, failing to admit that bias exists, failing to look for it, and failing to do anything constructive about it.

Put biased data into an unbiased equation and you get biased results.
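A minimal numeric sketch of that sentence, with invented numbers: fit an ordinary least-squares model (an "unbiased equation") on records where one group's outcomes were systematically marked down, and the model learns that markdown as if it were real.

```python
# "Biased data in, biased results out": least squares is an unbiased estimator,
# but if the recorded outcomes for one group were systematically marked down,
# the fitted model faithfully reproduces that markdown. Numbers are invented.

import numpy as np

rng = np.random.default_rng(0)
n = 1000
group = rng.integers(0, 2, n)               # two demographic groups, 0 and 1
skill = rng.normal(50, 10, n)               # true, group-independent ability

true_outcome = skill                         # outcome really depends only on skill
recorded = true_outcome - 5 * (group == 1)   # but group 1 was under-rated by 5 points

# Fit "outcome ~ skill + group" by least squares on the biased records.
X = np.column_stack([np.ones(n), skill, group])
coef, *_ = np.linalg.lstsq(X, recorded, rcond=None)

print(f"estimated group effect: {coef[2]:.2f}")  # ~ -5: the bias, learned as if real
```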

DATA ANALYSTS LOSE SIGHT OF PEOPLE


Big data does not provide new ideas. Big data is data, and data gives priority to analysis over emotions. It is hard to imagine that data could describe the emotional qualities we value most: beautiful, friendly, sexy, surprising, cute.

Thank you

+39 011 4815139

epanto

Eleonora Pantò | CSP

eleonora.panto@gmail.com

twitter.com/epanto

pinterest.com/epanto

facebook.com/eleonorapanto
