“Industrialised hiring processes can often reward mindless exaggeration”, according to The Economist. “Substance can matter less to recruiters than form”. That sounds plausible.
It concludes:
The incentives on both sides of the hiring process lean naturally towards glossing reality. If candidates were to give genuinely truthful answers (“I have a habit of making basic but calamitous errors”), many would rule themselves out of jobs. And if firms were to give a warts-and-all description of themselves, many would end up deterring good applicants. But a process designed to uncover the truth about job applicants would run a lot more smoothly if firms were also honest about themselves.
Fine words. People on all sides lie and exaggerate. This isn’t critical theory—it’s just life. Reality is routinely glossed over. To navigate it, one must learn when it’s appropriate to embellish the truth and when to present it plainly. After all, truth often boils down to some version of “we don’t know, but maybe we can try this.” What matters is phronesis: the wisdom to exercise good judgment with moral integrity. Life is a constant balancing act, requiring us to discern when to polish the edges and when to leave things unvarnished.
For example, in certain professional settings, with colleagues: don’t embellish; it’s obnoxious and a waste of people’s time. When the people at Google were building MapReduce, it made sense to be as precise as possible in internal technical discussions and not gloss over the details. But the vast majority of jobs don’t involve this; they involve selling products externally or selling ideas internally. Situations like building MapReduce are rarer than one might think. Even in a technical or scientific undertaking, where accuracy is important, it’s necessary to shape reality into a compelling narrative. Why?
Because we don’t merely process facts. The raw truth might be accurate but rarely motivates; it’s the narrative that frames the truth in ways people can grasp and rally around. The question brings to mind David Hume’s famous “is-ought” problem¹. Hume underscored the difficulty of moving from objective descriptions of what is to normative claims about what ought to be. This distinction is crucial: while the is describes reality as it exists—facts, processes, or observations—the ought reflects human values, goals, and prescriptions. The leap from one to the other is not straightforward.
When a job applicant lists their qualifications, they are (or appear to be) presenting facts (the ‘is’). They then make a leap to an ‘ought’, suggesting the employer should hire them based on these facts. The implication is: “I have accomplished all these things and I’m superior to the other candidates; therefore, you ought to choose me.” Would the job-application process be more effective if companies were ‘transparent’ about their own practices? If ‘firms were also honest about themselves’? These firms are already transparent and honest about themselves. Their version of honesty encompasses accepting a blend of fact and embellishment. This ability to embellish, gloss, and bullshit effectively is a prized skill today. Being a conman and a fraud is, generally, what we value as a society; spreading falsehoods and bullshit, and exaggerating everything, is part of the unique value proposition of the modern-day successful individual.
Footnotes
Hume’s philosophy is complicated, and I don’t adhere to his full metaphysical system, which rests on what Hilary Putnam calls “pictorial semantics” in The Collapse of the Fact/Value Dichotomy. Like Putnam, I think there can be a “matter of fact” about what’s right, virtuous, or good; it just doesn’t flow logically and seamlessly from the facts. Like Quine, I think there’s one “web of belief” (i.e., our understanding of the world): our knowledge, including both scientific theories and moral judgments, is interconnected like a web. But inside that “web of belief”, I value folk psychology and common sense, and common sense tells me that the is/ought distinction is useful.↩︎