I once attended a small scientific conference with a couple hundred attendees total. Over the two days of the conference there were many technical presentations to the full group, but two of the final presentations were non-technical. One presentation was from a science ethicist, who I believe was a professor at an Ivy-league school.
It pained me to be in the audience for the ethicist's talk and following Q&A session. She was unable to provide any framework for thinking about ethics which the scientists present did not already have. The scientists asked direct questions about how to approach specific ethical issues they saw in their own work, and her responses were all along the lines of "well, the important thing is that you think hard about it."
She displayed no ability to teach the scientists much, and no willingness to dive into discussion of their issues. Her presentation was a lot of the sort of thing that this article is trying to do: shaming STEM folks for their mistakes, and claiming the humanities is the cure-all. Let me make a claim: if the humanities are really a solution for problems created by STEM, all you should need to do is demonstrate the case. Technical and scientific crowds understand problems, and will jump on real solutions.
(Aside: of the STEM undergraduate programs I'm aware of in the US, they all require somewhere between 20% and 50% of classes to be in the humanities or social sciences)
> Shaming STEM folks for their mistakes, and claiming the humanities is the cure-all.
This is not what the article is saying. It basically says that future world leaders will come from among computer science folks, and that we need more people to get into STEM, but that we should also give them some humanities education as well. Those are completely different things. There is no shaming and there are no fanatical claims. They just want STEM graduates to have a more holistic point of view.
> She displayed no ability to teach the scientists much, ...
You can't attack the premise of the article by sharing a single anecdote from your past, which only shows the inability of the person in question to defend her arguments properly on that day and in that place. You can't generalize from that and use it to attack the worth of the humanities. It is ridiculous.
> Technical and scientific crowds understand problems, and will jump on real solutions.
Not so easy. A "problem" is not always a purely technical thing. Most of the time a problem needs to be analyzed from a humanities perspective as well as a pure technical perspective before jumping to the solving part. For instance:
- What if the problem is not worth solving?
- What if the "real" solution is malevolent?
- What if the problem is not really a problem?
- What if the human factors surrounding the problem somehow necessitate alternative solutions to the most obvious, efficient, or "real" one?
...
In the end, I found your comments to be too defensive. Nobody is attacking STEM people. Take it easy.
Closing question:
Is it a coincidence that we are heading to a technological dystopia and it is being implemented by the best engineers in the world? What is missing?
Sorry if I didn't make it clear, I was not trying to "attack" the article. I think my anecdote was relevant but my main aim was to share an event I experienced, not to win an argument or craft a straw man. I do think there was a real problem in the event I described, and anyone who has the goal of helping STEM fields needs to avoid those specific pitfalls. You are arguing past me.
Thanks for the clarification. I guess I was a bit defensive myself as well, being a former humanities guy and a current engineer.
I was just really bothered by the statement "Her presentation was a lot of the sort of thing that this article is trying to do: shaming STEM folks ...", as I didn't think at all that this was what the article is trying to do.
"well, the important thing is that you think hard about it" might not mean what you think it means.
Often, especially (in my experience) when communicated to someone in an engineering profession, that's an exhortation to not automatically assume the applicability of the listener's pre-existing ethical/methodological frameworks to the problem at hand. In other words, to consider something with minimized bias. Bias isn't always bigotry or a tendency towards specific solutions--it can also manifest as a pre-disposition towards previous "one-size-fits-all" deontological tools, which do apply to many situations, but may not yield comprehension, empathy, or minimized negative consequence when applied to a given problem.
Examples:
1. Engineer: "I was asked if it's possible to build $intrusive_privacy_invading_product, so I provided an answer by building it." Critic: "Could you construct it in a way that harm to privacy is minimized?" A mis-chosen framework there might be "exercising craftsmanship or capability is its own reward", or absolute meritocracy.
2. Liberal: "Gun ownership is harmful therefore should be banned." Critic: "Can you accomplish your goal in a way that allows people that genuinely enjoy guns to continue to do so? Your approach risks polarization at best and outright failure at worst." A mis-chosen framework there might be utilitarianism, or "short term political ends are the primary focus".
3. Conservative: "Regulations are bad, because they inherently restrict liberty." Critic: "Are some liberties more important than others? Your approach risks throwing out the baby with the bathwater." Mis-chosen frameworks there might be focus on influential historical figures' political realities, or individualism.
None of the frameworks chosen above are bad, inherently flawed (by their own moral compasses, of course), or inapplicable. However, they might be less useful as decisionmaking tools than some others.
Nobody can step all the way outside of their chosen deontology (can you use that as a noun? I hope so), but often people who ask that you "just take time to think" are requesting you to at least try.
I do appreciate your broader point. But in the talk I'm describing, it was the ethicist who lacked the empathy and comprehension to understand the scientists' concerns about dilemmas in their own work. There was no discussion of bias, even where that would have been appropriate. Communication to technical audiences needs to be done at a higher level of quality.
> Let me make a claim: if the humanities are really a solution for problems created by STEM, all you should need to do is demonstrate the case. Technical and scientific crowds understand problems, and will jump on real solutions.
Technical and scientific crowds understand technical problems. But non-technical problems go completely unnoticed.
Or as one of the actors on the show Silicon Valley wrote:
> As a cast member on a show about tech, our job entails visiting tech companies/conferences etc. We meet ppl eager to show off new tech.
> And they're amazing, wonderful, necessary products, right?
> Often we'll see tech that is scary. I don't mean weapons etc. I mean altering video, tech that violates privacy, stuff w obv ethical issues.
> And we'll bring up our concerns to them.
> They don't even have a pat rehearsed answer. They are shocked at being asked. Which means nobody is asking those questions.
I make no claim as to whether competent science ethicists exist. Clearly this anecdote is a good example of what professors of humanities should try to avoid if they aim to fix problems in STEM.
Many humanities scholars do not understand, or even argue against, the concepts of statistics. It may well be that she lacked even the capability to make an argument like the one you just made.
Could you please discuss more about what the point of your comment is?
To me it seems like you're implying that studying ethics is pointless because you or others didn't get anything out of their talk. Is that what you actually meant?
If there's anything I'm trying to imply, it's that there is a problem that needs to be addressed first: communication of ethics to a technical crowd can and should be done in a more professional and constructive way. Ideally my bad experience won't be repeated too many times. Otherwise you should expect more of the negative comments here about "wrongthink".
They keep on insisting this is the case but they never give reasons why and spend all of their time on the offensive. Cynically it looks like they want in on the action and are trying to extort what they want via character attacks instead of any merit. Like courting lobbyists. All while giving vague and unsubstantiated hype. Nobody needs such parasitic manipulators.
Even the example is a terrible one, as it shows a lack of technical knowledge and susceptibility to hype. Machine learning is vulnerable to all sorts of spurious correlations; that isn't a lack of humanities education but a failure to understand how to apply statistics and the limitations of the tools. Despite what the statistics say, Nicolas Cage doesn't drown people.
Humanities knowledge is just a tool which may and may not be used for good. FAANG already hires tons of humanities and social science PhDs to understand user behaviors. Whether that's used for good or ill depends on what the corporations are incentivized to do, not on the number of humanities grads employed.
(I'm speaking as a social science PhD employed at FAANG.)
So the problem of the blind spots in STEM-focused people, and the fact that this leads them to just unthinkingly do what they are told to do ... isn't a problem? Because it depends on what they are being told to do?
I don't buy that, more humanities knowledge distributed among everyone is clearly needed, because our system ensures that the people at the top are more likely to use that knowledge for ill and the lower ranks knowing what is going on can be a check on that.
Many of the people who go into STEM do so because they want to deal with the task put in front of them without it being complicated. They don't want to deal with a bunch of icky emotions, and they don't want to trip themselves up asking murky existential questions about why they're doing what they do. This is a condition of temperament: you can run them through a grist mill of humanities classes, but it doesn't mean they'll internalize any of it if it isn't what they're inclined towards. You can lead a horse to knowledge, but you can't make it think.
I think your characterization of people who go into STEM is detracting from what could be an otherwise compelling argument. There are stronger reasons why people choose STEM. I agree that adding a few ethics classes won't change ethical behavior outcomes for those who would have chosen STEM programs anyway.
I'm not as certain. I went into college full-on STEMLord type person. Humanities was useless, STEM was the only way forward, etc etc. Graduated much the opposite, wanting to see more people appreciate the Humanities and incorporate them into their work and such.
And it all happened because I took some interesting humanities courses, and had some fun teachers. My two philosophy ones -- a general introduction, then a Philosophy of Science course taught by a person who had actually gotten their undergraduate in biophysics -- really led to some interesting discussion, and got me reading deeper and more across the humanities. Now, that's most of my reading, it seems.
While it may be a problem that people "unthinkingly do what they are told to do", I don't think requiring engineers to take an ethics class will fix that.
What it does is give people the tools to properly articulate thoughts they are already having. It won't turn someone who just doesn't care into an activist.
That isn't a STEM thing by any means. It is a societal and organizational thing. When their job and/or life depends on it most people do the same thing.
A co-worker (and a great friend) of mine gave a lecture at University of Nebraska–Lincoln, on bridging the gap between the liberal arts and the tech industry.
He talks about the value that liberal arts can provide the tech industry. In my opinion he is also giving rhetoric against this notion that you are tied to working in the same field that you studied at school, and that you have to follow the "correct" path that other people tell you is "marketable" for your career.
We both studied liberal arts, and we both work together in tech. He got his PhD in Classical literature, and works on the strategy team. I am in my final semester for getting a BA in Philosophy, and I'm on the engineering team.
If you listen to the lecture, around the 28min mark he says "I asked him, do you regret majoring in Philosophy?" My response is still the same now as it was when he asked me. I don't regret it, and I don't regret choosing it over computer science or a related major. I believe having the philosophy background helps me live a better life, and helps me do my job well.
Ask certain academics in the philosophy department what exactly philosophy is, and they'll tell you something like: "the systematic testing of our rational beliefs about the world." This to me, is where philosophy and software engineering intersect.
Well of course you don't regret majoring in philosophy, because your outcome is good (you are employed). I wonder what the other folks who had much more difficulty getting jobs would recommend.
The lack of humanities seems to be mainly due to competitive factors and employers reducing risk. Assuming all other factors are equal, if you had only one spot to hire an entry-level software engineer, who would you take the risk on? The CS major or the philosophy major?
When their real goal is to "tackle misinformation" they should abolish any thinking prevalent in the humanities.
If you want pure objective truth you need logic and evidence: basically the tools employed by math, physics, and engineering. The concepts of classical positivism are over 100 years old and are still the best thing we've got for finding out the truth.
In order to "tackle misinformation" humanities have zero solutions to offer. Any mediocre engineer would be better.
A lot of the world's problems are just too complex and interconnected to have an accessible source of definitive truth. For example anything to do with pollution or major construction projects. You have to balance competing demands that are absurdly tangential, that should be a false dichotomy. But those are the choices that have to be made all the time and there is no analysis that can give you the answer.
Science and engineering strip out complexity to reach something simple enough to be modelled. It is naïve to believe that every problem can be treated like this.
> It is naïve to believe that every problem can be treated like this.
Is it really? I can make abstract models of all kinds of things and still get accurate predictions. The movement of single atoms in a gas is very "complex", yet thermodynamics can make precise predictions about the pressure, temperature, and volume of that gas.
Of course, you always have to check whether your model is accurate and whether the abstraction leaves out a very important part of reality.
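The thermodynamics claim above can be made concrete with a minimal sketch (my own illustration): the ideal gas law ignores individual molecules entirely, yet predicts bulk behavior well.

```python
# Ideal gas law: p = nRT / V -- an abstraction that discards every
# detail of molecular motion yet predicts macroscopic pressure.
R = 8.314  # molar gas constant, J/(mol*K)

def pressure(n_mol, temp_k, volume_m3):
    """Pressure in pascals of an ideal gas."""
    return n_mol * R * temp_k / volume_m3

# 1 mol at 273.15 K in 22.414 L: about one atmosphere (~101 kPa).
p = pressure(1.0, 273.15, 0.022414)
print(f"{p:.0f} Pa")
```

The model breaks down for dense or strongly interacting gases, which is exactly the "check whether the abstraction leaves out something important" caveat.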
A scientific prediction is not enough to make a decision in these kinds of cases.
For example, consider the building of a wind farm...
* The site is on the migration path of a species of endangered goose. You can model the number likely to be killed.
* Climate change is a problem. You can model the effect on the atmosphere of CO2 emissions.
* The turbines will be visible from a beauty spot and could affect tourism. You can model where the turbines are visible.
* People think that turbines are really ugly. You can quantify this with a questionnaire.
Lots of scientific prediction and analysis going on here. But none of that tells you what you should actually do about these issues. It doesn't tell you how many geese you are allowed to kill to reduce CO2, any more than science can tell you what flavor of ice cream to eat. It is a question of morality, geography, aesthetics, sociology and politics. Why wouldn't you want to understand those things?
If I know all of the above mentioned quantifiable objective variables with a reasonable accuracy I can make a much more informed decision.
Again, why do I need humanities for that decision?
What are the solutions they have to offer?
How can they do anything else than to say "You have to do it that way, because we say so"?
If they say "this solution is morally right, aesthetically beautiful, sociologically correct and politically desirable", aren't they imposing their own subjective, unverifiable "truths" onto others?
Just as having an in depth understanding of mathematics, physics, or engineering helps us to make more informed decisions, so too do the humanities. Fields of study such as philosophy, sociology, or aesthetics do not attempt to establish, as you put it, subjective unverifiable "truths" (to do so would be quite infeasible in the first place). Instead they allow for discussion and aggregation of information and reasoned opinions, so that a comprehensive understanding of the field and its many schools of thought may be reached.
Coming back to the wind farm: even once we have the objective variables, the decision we make will be based on factors of morality, geography, aesthetics, sociology and politics. By having a broad understanding of these fields and the various paradigms contained within them, we can produce a more informed decision than if these aspects were ignored completely.
In a problem that lacks definitive truth you have to apply subjective thinking. That is all you have left. And the wind farm is a good example of that. There is no scientific truth that means we have to care about threatened species, or the effect on tourism, or even climate change. Deciding to care about those things is a subjective decision. If you look at things from a higher level of abstraction the certainty just fades away.
I don't see the point in defending the humanities as some utilitarian tool. It is always a good idea to become more educated, and to try to understand the world a little better from different perspectives.
> I can make abstract models of all kinds of things and still get accurate predictions.
If you think that's working for you in dealing with human emotions, well...
How many non-STEM people have you persuaded to change their lives for the better with that approach?
For that matter, how many STEM people's lives have you impacted meaningfully that way? I'd guess more, but still not very many.
> ...How are humanities helping me with that?
I studied humanities intensely until I reached college. I was a semi-pro flutist for several years after, a good enough pianist that I could have gotten into music programs, and I was (and am) a good reader, world-builder, and editor (mainly of fiction, but sometimes non-fiction as well).
All that was, in fact, spectacular training for software development, as it taught me several lessons that many programmers I have encountered do not seem to have learned:
- You cannot understand anything meaningful perfectly.
- Nothing complex can be perfectly rigorously formalized by the human mind. The closer you get, the smaller the percentage of humans that can and will understand it becomes.
- A well-trained intuition for metaphor and symbolism can be a spectacular workaround for the problems presented by the previous two points. Many great artists don't realize the full depth of the symbolism and metaphors they themselves have created until their sharpest audience members help them see it, and that richness can communicate things not even the author fully understands. As an acquaintance whose name is escaping me once said, "Poetry is fundamentally about those things which cannot be named."
- Technical skills like software development or structural engineering, while they take some time and a reductionist mindset to learn, are not the hard part. I had absolutely no native talent for software and while I'll never be world-class, I'm pretty useful in a lot of contexts now. You can just learn it, unless you have biological problems with your intellect.
What's hard is seeing the complexity in this world as it truly is, understanding how far beyond your grasp it is, seeing how blindingly, dizzyingly beautiful it is despite the broken shards strewn throughout reality's nature, and finding ways to actually, genuinely take care of other people, without doing more harm than good due to inattention, arrogance, excessive reductionism, or any of the thousand other failings that plague us.
Humanities, especially music, taught me that technique is not the hard part by showing me starkly that blisteringly difficult technical exercises are often stultifyingly boring. When a field's whole point is aesthetic effect, it's far easier to see the ways in which technique matters and the ways in which it doesn't.
...this little rant is poorly written and structured, for which I am sorry. I have small children, so the amount of time I have to indulge in the lesser-paying pursuits I love, such as writing reasonably coherently about the humanities, is sadly limited.
In fact, I considered just deleting my response, because it's not a very good piece for its intended audience, but maybe it gets a little of what's in my head across.
In summary: I heartily agree that tech could use more humanities influence, especially software. I suggest particularly to other humanities-type creatives that software is a field that is fundamentally creative but which is also lucrative, due to it taking some work to learn, and it therefore makes a great day job.
> In order to "tackle misinformation" humanities have zero solutions to offer. Any mediocre engineer would be better.
Engineers are just as prone to misinformation as any other field. They just have more trouble recognizing it, due to a lack of training, and a mistaken belief they are immune.
> Engineers are just as prone to misinformation as any other field.
I would argue that this is exactly not the case, because they know how to employ the scientific method. If the humanities make all kinds of bold claims without providing logic and evidence, engineers won't listen for very long. That much is probably true.
The problem is that "tackle misinformation" and "ethics" get conflated here. If you want to know the truth you have to employ the scientific method.
Ethics, on the other hand, contains very few "truths" and is mostly social convention. E.g. in some places of the world it is considered ethically perfectly correct to kill your daughter if she commits adultery. However, there are some eternal truths that hold across all the different ethics out there, like the prisoner's dilemma, contract/game theory, and the fact that might is always right.
The latter eternal truths are the ones that are interesting and worth discussing. However, the former parts of ethics are the ones discussed by the humanities, and hence, from the viewpoint of an engineer, useless.
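The prisoner's dilemma mentioned above can be sketched in a few lines (my own illustration, with the usual textbook payoffs): in the one-shot game, defection strictly dominates cooperation regardless of any ethical convention.

```python
# One-shot prisoner's dilemma. Payoffs are (my years, their years)
# in prison, so lower is better for each player.
payoff = {
    ("cooperate", "cooperate"): (1, 1),
    ("cooperate", "defect"):    (3, 0),
    ("defect",    "cooperate"): (0, 3),
    ("defect",    "defect"):    (2, 2),
}

def best_response(their_move):
    # Pick whichever of my moves minimizes my own sentence,
    # holding the other player's move fixed.
    return min(("cooperate", "defect"),
               key=lambda mine: payoff[(mine, their_move)][0])

# Whatever the other player does, defecting is better for me:
print(best_response("cooperate"), best_response("defect"))  # defect defect
```

That mutual defection leaves both players worse off than mutual cooperation is exactly what makes the game a standing result rather than a convention.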
How do you know if your boss is mad at you? Or if someone is interested in you romantically? Or if a landscape is beautiful or ugly?
Not all knowledge is gained through empirical methods. We know that because people had knowledge, including sophisticated technology, before there was a scientific method, i.e. before the 16th century. Most of our knowledge is gained through interpretation of the world... basically, doing the various things our minds do to arrive at conclusions. In the humanities, we (I am a humanist though also a programmer) try to hone our interpretive ability. We have a bunch of fancy names for ways to do that, like hermeneutics and dialectic and rhetoric, but basically you do it through practice.
There's bad interpretive work, just like there's bad scientific work. But thinking empiricism is the be-all of knowledge is kind of silly. Do you use the scientific method to decide if you should kick a dog or pet it? Whether you believe a politician's statement? Whether a newspaper is trustworthy? No, you go through a process of modelling that other consciousness. That's interpretation.
Those three are in fact all partially or fully learned behavior. Anyone on the autistic spectrum who learned coping skills is acutely aware that those are observational given their "driver failure" calls for doing it manually. Standards of taste are learned and beauty does have mathematical elements like phi.
I knew someone was going to be like "That stuff is observational and therefore empirical!" If that's the case then the humanities is a science because we're doing empirical research on books. Often empirical research on individual sentences!
Mozilla has been on a roll lately of making statements that make me skeptical about the future of their browser. I just don't feel like every piece of software I use needs to have a robust social policy behind it. I'm happy to just use something that works well.
This is such an important point. Engineering does so much to shape society and the lives of individuals. But we give zero training to new engineers in these issues. Worse, so many engineers I speak to are willfully ignorant or disdainful of the humanities.
Engineers mustn't lose sight of who we work for: humanity.
Ideally the humanities forces you to think about values and the why of things, not just the how. Not all goals are worth achieving...you can "change the world" and make it a worse place.
More practically, the humanities teaches persuasion and rhetoric, in writing and in speaking, as well as reading between the lines when reading or listening. I work with engineers...kind of am one myself, though I was trained as a humanist...and I see them dashing themselves on those rocks on a weekly basis.
Every ABET accredited engineering school in the US is required to give a class on professional ethics as part of the degree. Doesn't seem to have much effect.
I'd much rather see engineers trained in law - it's considerably more practical and I'd wager will lead to far more positive outcomes than general humanities
Most modern legal frameworks already incorporate the better structured part of the humanities (philosophy & ethics).
However, none of this matters when we live in a hierarchical society where most people just do not exist in a bubble where they can dissent on any and all things they're working on without fear of reprisal. The fact that Google employees were able to speak-up about bidding on government contracts is huge, but that's a luxury most workers don't have.
> "Technology companies need to diversify their hiring practices to include more people from backgrounds in philosophy and psychology if they want to tackle the problem of misinformation online, the head of one of the biggest internet charities has warned."
If it's misinformation, shouldn't intelligent people of any background be able to identify and flag it?
> "we are intentionally building the next generation of technologists who have not even the framework or the education or vocabulary to think about the relationship of Stem to society or humans or life."
Oh. Because it's not about truth or critical thinking but frameworks and vocabulary.
As a veteran programmer, I'd say that on average, engineers in the Bay Area do need an attitude adjustment. There are a lot of awesome people here, and bad behavior is an exception, but I've seen more douchey entitled behavior from Bay Area programmers than I have in the entire rest of my professional career.
Another thing I've seen too often from programmers is a disdainful, even disgusted attitude when faced with negative feedback. The first reaction should be to try to understand where the feedback is coming from, not to disdain it. What I've seen too often is a defensive retreat behind the idea that the feedback is coming from an idiot. The thing to remember about online idiots is that a) most people aren't them, most of the time, and b) even when someone is one, they are usually far different in real life.
The feedback typically comes from people who think they are smarter than the engineers, or who have more social capital and so are used to being looked up to and respected. As such, they get pissy when nerds question them.
Nerds tend to accept good solid feedback when a good reason is given; nerds rightly disregard feedback that is about pleasing peoples egos. Really smart nerds do strike a balance, but egos should be considered a flaw, and worked around as any other flaw.
The feedback typically comes from people who think they are smarter than the engineers
I know exactly the kind of feedback of which you speak. I once hired a tester to test a 2D game with Asteroids movement mechanics. His feedback: "Where are the brakes? It doesn't go where I point it." Well, that's sort of the point of the Asteroids movement mechanic, and my tester didn't even bother to understand the point.
Nerds tend to accept good solid feedback when a good reason is given;
In my experience, not always. I've had a number of experiences where I explain my background as a fellow programmer, and I'm doing my level best at explaining in depth how I perceive a design as a user and why I'm reacting, and all too often, instead of getting clarifying questions or a reflection-back, I immediately get the start of some 'splainin. Conclusion jumped to after no iterations, and apparently after only a simple pattern-match. One time, I'd started explaining my reactions as a user, then got back a reply about the codebase. There were lots of changes to the UX, yet the major "plusses" for the change were from the POV of the programmer, and as a user, I'm left wondering, "What's in it for me?"
Really smart nerds do strike a balance, but egos should be considered a flaw, and worked around as any other flaw.
Most nerds never get past the point of only considering the egos of others to be a flaw.
I think that is a regional thing, period: they exist everywhere, but the concentration seems extra high. Look at the people living near the Hollywood sign, who complain about tourists, complain about problems other areas would kill for, and more. There unfortunately isn't a shortage of bad attitudes from anyone, rich or poor.
Is it the programmers/engineers that make the "incorrect" value judgements? It seems to me that it's usually the managers who are responsible for ethical gaps, and engineers are caught up in the act. That's the exact situation in which I would understand disdainful attitude.
For examples of the consequences of big data and ML, check out mathematician Cathy O'Neil's book "Weapons of Math Destruction". She posits that the problematic systems share three key features: they are opaque, unregulated and difficult to contest, and at the same time scalable, thereby amplifying any inherent biases to affect increasingly larger populations.
There are plenty of ways you could improve a random engineer. One for many guys would be to have played a team sport, or been in a marching band. Or something else outside the scholastic bubble.
Another would be mandatory physical education in college.
I'd also posit it would be helpful for larger tech firms and most accelerators to have an ethicist on staff, to help with the "could we vs. should we" evaluation.
For this to work, the ethicist would have to be well versed in philosophy. In our current political climate I suspect that anyone who obtained such a position would be anything but.
It would definitely be helpful from a corporate strategy perspective, but I think the real solution is educating engineers in the humanities or liberal arts.
I argue it nudges in the right direction. I agree you can't force someone to be moral anymore than you can force them to be decent or civil. That's what parents, free will, and introspection are for.
It's not just the metaphorical "tech" that suffers, it's also the candidates themselves.
When you have empathy and you prefer thinking about your impact (on customers, on the environment, on current culture, etc), what is open to you is a long and arduous road. Applying to companies, the choice is limited. Creating products, there's extra inertia. Acquiring experience with these constraints is also mentally draining at times. Self-image can also take a hit, when your beliefs have you refuse easy ways towards prestige or money. You might walk with your head held high, but not necessarily have the savings or the opportunities that your friends have. There's a social cost to "doing the right thing" that not everyone is willing or able to pay.
It takes a bit of luck to find the right places and projects, but to anyone out there facing similar struggles: keep believing in yourself. The system will come around eventually.
I just find it interesting how they seem to find AI to be the issue half of the time, when it just reflects what it's shown. The example near the end just reflects their internal process, give or take mistakes in the initial design and the method used.
In the US, there is a big problem with trying to get more balance in the STEM fields by introducing humanities education, in that the humanities in higher education are massively skewed to the left.
I'm not sure whether this is aimed at the Zuckerbergs of this world, or an attempt to shoehorn (yet more) leftist viewpoints into tech. Personally I'm just going to continue ploughing my autistic tech furrow until the entryists hopefully move on.
A) Tech already hires psychologists to optimize mark fleecing.
B) This is rich, coming from an exec at a "nonprofit" that gets most of its money from Google.
C) Spend more time around humanities professors--no less ruthless than the engineers, but they can pull quotes from Foucault and Derrida to justify their own assholery. A less enlightened engineer would just have to accept that about himself.