What Are Tech Ethics?

Technologists develop solutions—but when emerging technology and social media dramatically alter how we learn, live and even die, who addresses the problems they create? It’s time we reflect on Tech Ethics: our approach to the implications and unintended consequences of developing and implementing technology.

Issues in Tech Ethics

Tech Ethicists address algorithmic bias, potential job losses from advancing technology, data privacy, the role of tech platforms in moderating speech, the ethics of autonomous vehicles and the ethical design of products and apps.

Stay on Top of Tech Ethics

Discover in-depth stories and resources for your course, and sign up to stay informed!

Big Tech Is in an Ethical Quagmire in Determining Allowable Speech

Big Tech appears to be caught between a rock and a hard place. When platforms such as Facebook and Twitter ban speech they deem inappropriate, they are accused of bias and censorship. And when the platforms keep up content that others believe is sowing discord, chaos and misinformation, they are accused of being neglectful. To make matters more complicated, tech companies must be careful not to be classified as publishers or media companies, as that designation would trigger legal liability.

Questions for Your Course:

  • Is the use of algorithms to promote (or lessen) the popularity of speech a quasi-editorial decision?
  • Do we trust tech companies to make the types of speech determinations that have typically been made by the legal system?
  • Should platforms be allowed to create whatever rules they’d like regarding speech, or should those rules coincide with our conception of the freedom of speech?

We Are Becoming Botified: Exploring the Ethics of Automating Our Online Communication

Are we talking to another human when we’re online? While there’s been a lot of discussion around bots running rampant online, there’s also the less noticeable problem of automated speech that is blurring the line between being human and being bot. While automating communication may be advantageous for both individuals and platforms (for the latter, more data means more money), it may offend our notion of reciprocity in relationships—and sow confusion.

Questions for Your Course:

  • What is your ethical responsibility as a communicator around the transparency of your communication? For instance, would you notify a recipient of automated responses?
  • Platforms such as Google (through Gmail) and LinkedIn have been heavily pushing automated tools, such as autocompleted sentences. What is their role in transparency?
  • We like to say, “It’s the thought that counts.” Automated communication is by nature thoughtless. Does it therefore not count?

Discover More!

From human-like bots to preserving human relationships, learn more about becoming botified.

Exercise

Engage Now with this Educational Exercise

Let’s imagine that you and your significant other text each other every night to say, “I love you.” When your partner texts you the statement, “I love you,” there are likely only a few variations of responses that you typically write back. Would it be unethical to automate your response by setting an auto-reply of “I love you!” back to your partner?

Is It Unethical to Have Robots That Look Like Humans?

Robots that look human tend to cause a mixture of awe and apprehension, triggering ethical concerns around how we should treat these machines and also how the machines may be altering our own behavior. If a robot looks human, should it be treated just like a human? A major concern revolves around the mistreatment of human-looking robots and how that may—or may not—influence our human-to-human interactions.

Questions for Your Course:

  • Would having a human-looking robot increase aggressive behavior, or would it serve as a harmless outlet? 

  • At what point would a human-looking robot deserve rights?

David Ryan Polgar

Tech Ethicist & Founder of All Tech Is Human

David Ryan Polgar is a pioneering Tech Ethicist who paved the way for the hotly debated issues around Facebook, privacy, ethical design, digital wellbeing and what it means to be human in the digital age. He is often featured in the media, including CBS This Morning, Fast Company, SiriusXM, CNN.com, LA Times, Washington Post and many others. David is the founder of All Tech Is Human, an initiative to better align technology with the human interests of users, and the co-host of the podcast & NYC live show Funny as Tech. He is a frequent writer (Quartz, IBM think Leaders, Dell Perspectives) and global speaker (3-time TEDx, The School of The New York Times, The Next Web). David is currently working on a digital citizenship class for adults (Skillshare). He is an advisor for Hack Mental Health and #ICANHELP.

WATCH INTERVIEW  VISIT DAVID'S WEBSITE

More from David

All Tech Is Human

All Tech Is Human is an initiative and event series that helps co-create a more thoughtful approach to technology.

Funny as Tech

Funny as Tech is a weekly podcast and regular live show tackling controversial issues in technology.


Tech Ethics Library

Tech Ethics Curricula: A Collection of Syllabi (Medium)
READ ›

Ethics in Technology Practice
A project from the Markkula Center for Applied Ethics at Santa Clara University, with support from the Tech and Society Solutions Lab at Omidyar Network. Includes an overview, case studies, a toolkit, slides, and more.
LEARN MORE ›

Ethical Foundations Class Introduces Ethical Thinking to Computer Science Students
(The University of Texas at Austin)
READ ›

Ethically Aligned Design (IEEE)
Crowdsourced from cross-disciplinary academics and leaders across the globe, this updated guide includes recommendations for policymakers, academics, and technologists.
DOWNLOAD GUIDE › 

Data Science Code of Professional Conduct
READ ›

10 Commandments of Computer Ethics (from the Computer Ethics Institute)
READ ›

The Interface, by Casey Newton
An evening newsletter about Facebook, social networks, and democracy.
READ ›

Tech Ethics in Action (Medium)
READ ›

Who should get to decide what’s ethical as technology advances? (The Next Web)
READ ›

Data for Democracy
“Data for Democracy brings together an active, passionate community of people using data to drive better decisions and improve the world in which we live.”
LEARN MORE ›

Data & Society
“Data & Society is a research institute focused on the social and cultural issues arising from data-centric technological development.”
LEARN MORE ›

COED:ETHICS (based in UK)
“Resources & Ideas about coding ethics for developers and technologists.”
LEARN MORE ›

Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech
by Sara Wachter-Boettcher

You Are Not a Gadget: A Manifesto
by Jaron Lanier

Heartificial Intelligence: Embracing Our Humanity to Maximize Machines
by John C. Havens

Want More Food for Thought?

Stay up-to-date on articles, news and more on Tech Ethics.
