Guide to 21st Century HR - Assessment Systems


We Know People

Guide to 21st Century HR - a hitchhiker's guide to the galaxy

Zsolt Feher | Managing Director / Europe


10 unavoidable points when you make your decisions as an HR leader


Generation Y

“The children now love luxury; they have bad manners, contempt for authority; they show disrespect for elders and love chatter in place of exercise. Children are now tyrants…. They contradict their parents, chatter before company…. and tyrannize their teachers.”

Socrates

Hogan is HR analytics and Big Data

Milestones, 1980-2017:

• We proved that personality predicts occupational performance.

• We discovered how leadership has financial consequences.

• We showed that personality predicts leadership performance: who you are determines how you lead.

• We identified 11 dark side personality factors that derail leaders and organizations.

• We demonstrated the need to distinguish between leader emergence and leader effectiveness.

HR analytics that works

• Assessments provide an unbiased and scientific basis for making informed decisions about people.

• Business success depends on making good decisions about money and people.

• Using data to support decisions about people is always best practice.

What’s the challenge?

1. Many new solutions on the market

2. What works and what doesn't? How do you know?

3. Data is powerful, but we don't use it well

TOP 10

1. Tool usage at large does not change

2. False need: trying to justify a mobile-first strategy; shorter assessments are not better

3. Snake oil is all around the market

4. Wrong tech focus from startups

5. A/I is only A/S without good data

6. Validity is still the engine, but not everyone has it

7. The square of preferences: cost, accuracy, user experience, fairness

8. The ‘left out’ factor

9. Dashboarding is not analytics

10. Speed of change

1. Tool usage at large

2. False need

Shorter is better?

Mobile first?

Mobile First

Source: Jobvite. 2018 Job Seeker Nation Study.

Mobile First?

Surveying pre-hire assessment takers (N = 1,063 applicants; mean age = 39.5)

Device usage: Desktop 39%, Laptop 45%, Tablet 6%, Smartphone 11%

Smartphone users reported the lowest rate of distraction (8%).

Mobile First?

Surveying pre-hire assessment takers (N = 1,063 applicants; mean age = 39.5)

Device usage by year:
2015: Desktop 41%, Laptop 51%, Tablet 4%, Smartphone 3%
2016: Desktop 40%, Laptop 49%, Tablet 4%, Smartphone 7%
2017: Desktop 39%, Laptop 45%, Tablet 6%, Smartphone 11%

Mobile First?

Surveying pre-hire assessment takers (N = 1,063 applicants; mean age = 39.5)

Mean age by device: Desktop 41.5, Laptop 39.0, Tablet 40.0, Smartphone 34.4, Overall 39.5

Mobile First?

Oregon State Univ. & Shaker Research (cf. Hardy, Gibson, Sloan, & Carr, 2017)

• Mobile assessments took longer

• One third of entry-level candidates demanded mobile assessments

• Only 1-3% of higher-level job candidates used a mobile device

The shorter the assessment, the better?

• Do shorter assessments reduce dropout?¹

• Most dropout occurs in the first 10 minutes

• Dropout is flat after that, at 1-2%

• Mostly "good attrition": poorer candidates drop out (a simple dropout-by-time tabulation is sketched below)

1. Oregon State Univ. & Shaker Research (cf. Hardy, Gibson, Sloan, & Carr, 2017)
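To make the dropout pattern above concrete, here is a minimal sketch of a dropout-by-time tabulation. The session records, field names, and bucket size are purely hypothetical assumptions for illustration, not data from the study cited above.

```python
from collections import Counter

# Hypothetical assessment sessions: minutes elapsed before the candidate
# stopped, and whether the assessment was completed.
sessions = [
    {"minutes_elapsed": 3, "completed": False},
    {"minutes_elapsed": 8, "completed": False},
    {"minutes_elapsed": 12, "completed": False},
    {"minutes_elapsed": 22, "completed": True},
    {"minutes_elapsed": 24, "completed": True},
    {"minutes_elapsed": 25, "completed": True},
]

def dropout_by_bucket(sessions, bucket_minutes=5):
    """Share of all sessions abandoned in each elapsed-time bucket."""
    drops = Counter()
    for s in sessions:
        if not s["completed"]:
            bucket = (s["minutes_elapsed"] // bucket_minutes) * bucket_minutes
            drops[bucket] += 1
    total = len(sessions)
    return {f"{b}-{b + bucket_minutes} min": round(n / total, 3)
            for b, n in sorted(drops.items())}

print(dropout_by_bucket(sessions))
# With records like the above, most abandonment shows up in the earliest buckets.
```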

3. Snake oil is all around

Do not fall for ‘look and feel’

Always check the science

What to consider? (The vicious 8)

1. What do the tools actually measure?

2. What backgrounds and professional affiliations do the tool developers have?

3. Have the tools been peer-reviewed or reviewed by unbiased third parties?

4. Do the tools adhere to any employment guidelines or standards?

5. Are the tools accompanied by technical reports or validation studies?

6. Are the tools appropriate for the job under consideration?

7. How is the performance of the tools measured?

8. Are the tools and products adapted to different cultures and supported locally?

4. Wrong tech focus from startups

5. A/I vs. A/S

Artificial Intelligence without good data and expert guidance is only Artificial Stupidity.

6. Validity above all still matters

7. The square of preferences

The four corners of the trade-off: cost, accuracy, user experience, fairness (a scoring sketch follows at the end of this section).

Accuracy

• The fundamental goal is accurate prediction

• Better measurement means better prediction

• Better prediction means fewer errors

• Errors are costly (a short worked illustration follows below)
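As a minimal sketch of what "better measurement means better prediction" looks like in numbers, the snippet below correlates assessment scores with later performance ratings. All values are invented, and the Pearson correlation simply stands in for a criterion-related validity coefficient (requires Python 3.10+ for statistics.correlation).

```python
from statistics import correlation  # available in Python 3.10+

# Hypothetical data: assessment scores at hire and later job-performance ratings.
assessment_scores = [52, 61, 70, 45, 88, 67, 73, 59, 80, 64]
performance_ratings = [3.1, 2.8, 4.2, 3.0, 3.9, 3.2, 3.6, 4.0, 4.1, 2.9]

# Criterion-related validity: correlation between predictor and outcome.
validity = correlation(assessment_scores, performance_ratings)
print(f"Estimated validity coefficient: r = {validity:.2f}")

# The higher r is, the more closely ranking candidates by the assessment
# mirrors their eventual performance, i.e. fewer costly selection errors.
```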

Cost

• Cost will always be a consideration

• New technologies are successful when cost goes down and effectiveness goes up

• Consider total cost of ownership

• AI/ML applications have promising cost implications

• Cost doesn't seem to be the key barrier preventing use of current, accurate measurement methodologies

User Experience

• User experience is tied to brand

• How do your trade-off decisions reflect on your brand?

• Individual differences in user experience

• Demographic differences in user experience

• Digital nativity is one variable impacting user experience

Fairness

• Legal

• Tied to accuracy

• Candidate perceptions
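To close out the square of preferences, here is one hedged way to make the four-way trade-off explicit: a simple weighted score per tool. The tool names, ratings, and weights below are purely illustrative placeholders, not vendor ratings or a Hogan method.

```python
# Illustrative multi-criteria comparison across the four corners of the
# "square of preferences". All names, 0-10 ratings, and weights are
# hypothetical placeholders chosen only for this sketch.
criteria_weights = {"accuracy": 0.4, "fairness": 0.3, "user_experience": 0.2, "cost": 0.1}

tools = {
    "Tool A": {"accuracy": 8, "fairness": 7, "user_experience": 6, "cost": 5},
    "Tool B": {"accuracy": 6, "fairness": 8, "user_experience": 9, "cost": 8},
}

def weighted_score(ratings: dict, weights: dict) -> float:
    """Weighted sum of criterion ratings; cost is rated as value for money, so higher is better."""
    return sum(ratings[criterion] * weight for criterion, weight in weights.items())

# Rank the candidate tools by their overall trade-off score.
for name, ratings in sorted(tools.items(),
                            key=lambda item: weighted_score(item[1], criteria_weights),
                            reverse=True):
    print(f"{name}: {weighted_score(ratings, criteria_weights):.2f}")
```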

8. The left out factor

9. Dashboarding is not analytics

10. Speed of change

TOP 10

1. Assessment usage at large does not change

2. False need: trying to justify a mobile-first strategy; shorter assessments are not better

3. Snake oil is all around the market

4. Wrong tech focus from startups

5. A/I is only A/S without good data

6. Validity is still the engine, but not everyone has it

7. The square of preferences: cost, accuracy, user experience, fairness

8. The ‘left out’ factor

9. Dashboarding is not analytics

10. Speed of change

How do you defeat the challenge?

1. Careful selection: do not fall for the hype

2. What works, WORKS!

3. Use these 10 points to make your decision, and go for it!

Contact

ZSOLT FEHER

11 S. Greenwood | Tulsa, OK 74120

+ 36 20 383 40 93

zfeher@hoganassessments.com
