## SciPost Submission Page

# First-order transition in a model of prestige bias

### by Brian Skinner

#### This is not the current version.

### Submission summary

As Contributors: | Brian Skinner
---|---
Arxiv Link: | https://arxiv.org/abs/1910.05813v2 (pdf)
Date submitted: | 2019-10-31
Submitted by: | Skinner, Brian
Submitted to: | SciPost Physics
Discipline: | Physics
Subject area: | Statistical and Soft Matter Physics
Approach: | Theoretical

### Abstract

One of the major benefits of belonging to a prestigious group is that it affects the way you are viewed by others. Here I use a simple mathematical model to explore the implications of this "prestige bias" when candidates undergo repeated rounds of evaluation. In the model, candidates who are evaluated most highly are admitted to a "prestige class", and their membership biases future rounds of evaluation in their favor. I use the language of Bayesian inference to describe this bias, and show that it can lead to a runaway effect in which the weight given to the prior expectation associated with a candidate's class becomes stronger with each round. Most dramatically, the strength of the prestige bias after many rounds undergoes a first-order transition as a function of the precision of the examination on which the evaluation is based.
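The bias the abstract describes can be illustrated with a minimal Gaussian-conjugate update, in which an evaluator combines a class prior with a noisy exam score. This is only a sketch of the general mechanism: the Gaussian forms, the parameter names, and the numerical values below are illustrative assumptions, not the paper's actual model.

```python
def posterior_mean(score, prior_mean, prior_var, exam_var):
    """Combine a class prior N(prior_mean, prior_var) with a noisy
    exam score (noise variance exam_var) via the standard Gaussian
    conjugate update. The weight on the prior grows as the exam
    becomes less precise (exam_var increases)."""
    w_prior = (1.0 / prior_var) / (1.0 / prior_var + 1.0 / exam_var)
    return w_prior * prior_mean + (1.0 - w_prior) * score

# Hypothetical numbers: a prestige-class prior centered at +1,
# and a candidate whose exam score is 0.
precise_exam = posterior_mean(0.0, 1.0, prior_var=1.0, exam_var=0.1)
noisy_exam = posterior_mean(0.0, 1.0, prior_var=1.0, exam_var=10.0)

# With a precise exam the estimate stays near the raw score;
# with a noisy exam it is pulled strongly toward the class prior.
```

In the repeated-evaluation setting of the paper, the posterior from one round feeds the prior of the next, which is how the weight on the class label can grow from round to round.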


### Submission & Refereeing History

## Reports on this Submission

### Anonymous Report 1 on 2019-12-09 (Invited Report)

- Cite as: Anonymous, Report on arXiv:1910.05813v2, delivered 2019-12-09, doi: 10.21468/SciPost.Report.1379

### Strengths

1. This simple agent-based model shows how the interplay between prior bias and Bayesian updating leads either to vanishing error in the estimated quality of candidates or to a persistent influence of the prior bias that the updates cannot remove.

2. The submission is generally clearly written and leaves food for thought.

### Weaknesses

1. That said, I regret the wording "precision" for $p$: high precision corresponds to small $p$, which is very confusing, especially since $p$ is the crucial parameter of the model, and it forces an unnecessarily convoluted mental process. The paper proposes thinking of power as the inverse of precision; why not use power in the discussion instead?

### Report

Last question: is there an intuitive reason for the fact that $p_c=1/\sqrt{3}$?

### Requested changes

precision -> power ?
