Chapter IS-4

ETHICAL, LEGAL, AND SOCIAL CONSIDERATIONS IN INFORMATION SYSTEMS

Topics Covered:

  • Privacy and personal information in the digital age
  • Privacy laws and regulations (GDPR, HIPAA, FERPA, CCPA)
  • Ethical issues in information systems
  • Intellectual property rights and challenges
  • Censorship and internet freedom
  • Green computing and sustainability
  • Artificial intelligence ethics and emerging regulations

In today’s interconnected world, information systems touch nearly every aspect of our personal and professional lives. With this increased reliance on technology comes significant responsibility. This chapter examines the ethical, legal, and social challenges organizations face as they implement and manage information systems.

4.1 Privacy in the Digital Age

Privacy is “assurance that the confidentiality of, and access to, certain information about an entity is protected” (Computer Security Resource Center, n.d.).

The Value of Personal Information

Personal information has become a valuable business asset: it enables account verification, targeted marketing, personalized services, and credit decisions. Table 4.1 summarizes the categories of personal data organizations commonly collect, how they use that data, and the concerns each category raises.

Table 4.1: Common Types of Personal Data Collected by Organizations

| Data Category | Examples | Common Business Uses | Recent Concerns |
| --- | --- | --- | --- |
| Identity Information | Name, address, date of birth, Social Security number | Account creation, verification, compliance | Identity theft from breaches |
| Contact Information | Email, phone number, mailing address | Customer communication, marketing | Spam, phishing attacks |
| Financial Information | Credit card numbers, bank accounts, transaction history | Payment processing, credit decisions | Financial fraud |
| Behavioral Data | Purchase history, browsing patterns, search queries | Personalized recommendations, targeted ads | Manipulation, privacy invasion |
| Location Data | GPS coordinates, IP addresses, check-ins | Service delivery, local marketing | Tracking, surveillance |
| Biometric Data | Fingerprints, facial recognition, voice patterns | Security authentication, identity verification | BIPA violations, misidentification |
| Health Information | Medical records, prescriptions, fitness data | Healthcare delivery, insurance | Reproductive health tracking concerns |
| Social Data | Friend networks, posts, likes, messages | Social features, content recommendations | Mental health impacts, manipulation |

Who owns this information? When personal information is collected as part of doing business, it belongs to the company. However, privacy laws increasingly grant individuals rights over their data.

Key Privacy Challenges

Social Networking Risks: Your social media presence can affect employment and insurance. In 2024, 70% of employers reported using social media to screen candidates (CareerBuilder, 2024).

Workplace Monitoring: Companies increasingly use AI-powered tools to monitor employees—tracking keystrokes, analyzing email sentiment, and monitoring video feeds.

Biometric Data Collection: Facial recognition and fingerprint scanning are now common. In 2021, Illinois residents won a $650 million settlement from Meta (then Facebook) for violating the state’s Biometric Information Privacy Act (New York Times, 2023).

Best Practices for Privacy Protection

Organizations can reduce privacy risk by:

  • Collecting only the data needed for a stated business purpose
  • Obtaining informed consent before collecting or sharing personal data
  • Securing stored data with safeguards such as encryption and access controls
  • Notifying affected individuals promptly after a breach
  • Honoring individuals’ rights to access, correct, and delete their data
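A common technical safeguard for the personal data in Table 4.1 is pseudonymization: replacing direct identifiers with irreversible tokens before the data is used for analytics. A minimal sketch in Python—the field names and the salt value are illustrative, not taken from any particular system:

```python
import hashlib

# Illustrative salt; in practice it would be stored separately from the data
SALT = b"rotate-this-secret-regularly"

def pseudonymize(record, direct_identifiers=("name", "email", "ssn")):
    """Return a copy of the record with direct identifiers replaced by salted hash tokens."""
    safe = dict(record)
    for field in direct_identifiers:
        if field in safe:
            token = hashlib.sha256(SALT + str(safe[field]).encode()).hexdigest()[:16]
            safe[field] = token
    return safe

customer = {"name": "Ada Lovelace", "email": "ada@example.com", "purchase": "laptop"}
print(pseudonymize(customer))  # purchase history kept; name and email become tokens
```

Because the same input always yields the same token, analysts can still link a customer’s records together without ever seeing the underlying identity.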

4.2 Privacy Laws and Regulations

Different regions have developed varying approaches to privacy regulation. Understanding these frameworks is essential for businesses operating in global markets.

Constitutional Foundation: The Fourth Amendment

Privacy rights in the United States are rooted in the Fourth Amendment, which protects against unreasonable searches and seizures. While the amendment doesn’t explicitly mention ‘privacy,’ courts have interpreted it as establishing fundamental privacy rights, particularly against government intrusion (U.S. Const. amend. IV).

The Third-Party Doctrine: Information voluntarily shared with third parties (like emails with Google) loses some Fourth Amendment protection. However, in Carpenter v. United States (2018), the Supreme Court recognized limits to this doctrine in the digital age.

Table 4.2: Major Privacy Regulations Comparison

| Regulation | Jurisdiction | Effective Date | Scope | Key Protections | Maximum Penalties |
| --- | --- | --- | --- | --- | --- |
| GDPR (General Data Protection Regulation) | European Union | May 2018 | All personal data of EU citizens/residents | Consent requirements, right to access, right to erasure, data portability, 72-hour breach notification | €20M or 4% of global annual revenue |
| HIPAA (Health Insurance Portability and Accountability Act) | United States | April 2003 | Protected health information (PHI) held by covered entities | Privacy and security of medical records, patient rights to access records, breach notification | $1.5M per violation category per year |
| FERPA (Family Educational Rights and Privacy Act) | United States | 1974 (amended 2002) | Education records of students at institutions receiving federal funding | Student/parent rights to access records, consent required for disclosure, right to request corrections | Loss of federal funding |
| COPPA (Children’s Online Privacy Protection Act) | United States | April 2000 | Online collection of personal information from children under 13 | Verifiable parental consent required, clear privacy policies, data security requirements | $50,120 per violation (adjusted for inflation) |
| FACTA (Fair and Accurate Credit Transactions Act) | United States | December 2003 | Consumer credit information | Free annual credit reports, identity theft protection, proper disposal of consumer information, fraud alerts | FTC enforcement with civil penalties |
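The GDPR cap in Table 4.2 is “whichever is greater” of the two amounts, so the flat €20M figure binds small firms while the 4%-of-revenue prong dominates for large ones. A quick illustration (the revenue figures are hypothetical):

```python
def gdpr_max_fine(global_annual_revenue_eur):
    """GDPR Art. 83(5) cap: EUR 20 million or 4% of global annual revenue, whichever is greater."""
    return max(20_000_000, 0.04 * global_annual_revenue_eur)

# A firm with EUR 100M in revenue is capped by the flat EUR 20M prong;
# a firm with EUR 10B in revenue faces the 4% prong instead (EUR 400M).
print(gdpr_max_fine(100_000_000))     # 20000000
print(gdpr_max_fine(10_000_000_000))  # 400000000.0
```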

The U.S. Sectoral Approach

Unlike the EU’s comprehensive GDPR, the United States employs a sectoral approach—different laws for different industries. Most social media, smart home devices, and mobile apps are not covered by federal privacy law.

State of Alabama Privacy Laws:

✓ Active Privacy Laws:

  • Alabama Genetic Data Privacy Act (2024) – Protects genetic testing data from companies like 23andMe.
  • Data Breach Notification Act (2018) – Requires 45-day breach notifications.

Pending Legislation:

  • HB 283 (Personal Data Protection Act) – Failed in 2025; likely to be reintroduced.
  • HB 449 (Genetic Theft Criminalization) – Makes unauthorized DNA sales a felony (Alabama HB449, 2025).

NOT Currently Protected:

  • Social media data
  • Smart home devices
  • Mobile app data
  • E-commerce behavioral tracking

University of North Alabama Privacy Policy:

UNA implements security safeguards, does not sell visitor information, and releases personally identifiable data to third parties only with permission or as required by law, though it acknowledges it cannot guarantee complete data security (University of North Alabama, 2020).

4.3 Ethical Issues in Information Systems

Beyond legal compliance, organizations face ethical dilemmas that require careful consideration.

The Ten Commandments of Computer Ethics

In 1992, the Computer Ethics Institute developed a framework that remains relevant today. These principles provide guidance for ethical behavior in the digital age:

Table 4.5: The Ten Commandments of Computer Ethics

  1. Thou shalt not use a computer to harm other people.
  2. Thou shalt not interfere with other people’s computer work.
  3. Thou shalt not snoop around in other people’s computer files.
  4. Thou shalt not use a computer to steal.
  5. Thou shalt not use a computer to bear false witness.
  6. Thou shalt not copy or use proprietary software for which you have not paid.
  7. Thou shalt not use other people’s computer resources without authorization or proper compensation.
  8. Thou shalt not appropriate other people’s intellectual output.
  9. Thou shalt think about the social consequences of the program you are writing or the system you are designing.
  10. Thou shalt always use a computer in ways that ensure consideration and respect for your fellow humans.

(Computer Ethics Institute, 1992)

Digital Ethics Challenges

Algorithmic Bias: AI systems can perpetuate discrimination. In 2023, AI hiring tools were shown to be biased against qualified minority candidates, and in 2024 several companies faced civil rights lawsuits over such tools (MIT Technology Review, 2023).

Surveillance Capitalism: Business models based on extracting maximum personal data and using it to manipulate behavior raise concerns under Commandments 1 and 10.

Misinformation and Deepfakes: AI-generated content can create convincing fake videos and audio. During 2024 primary elections, AI-generated robocalls impersonating candidates were used to mislead voters, prompting the FCC to declare such calls illegal (Cybersecurity Ventures, 2024).

Content Moderation and Censorship

Platform Responsibility: Social media companies face pressure to remove harmful content while preserving free speech. Different countries have different standards.

Government Censorship: Countries like China, Russia, and Iran extensively censor internet access. The “Great Firewall” of China blocks access to Facebook, Twitter, YouTube, and many other services.

Net Neutrality: The principle that ISPs should treat all data equally, without discriminating based on user, content, or service. The FCC restored net neutrality rules in 2024 after years of debate (FCC, 2024).

4.4 Intellectual Property in the Digital Age

Intellectual property protects creations of the mind. Digital technology has created new challenges for IP protection.

Copyright: Protects creative works including software, music, videos, and written content. The Copyright Act of 1976 (amended 1980) covers computer programs.

AI and Copyright: Major lawsuits question whether AI companies violated copyright by training models on copyrighted works without permission, and ownership of AI-generated content remains unclear. Digital distribution raises related fairness concerns: music streaming pays artists roughly $0.003 to $0.005 per stream.

Patents and Trade Secrets: Protect inventions and proprietary business information. Tech companies file thousands of patents annually. Patent trolls abuse the system.

Trademarks: Protect brand identity through names, logos, and symbols. Social media has created challenges with impersonation accounts and counterfeit product sales.

Fair Use Considerations

Fair use allows limited use of copyrighted material for criticism, commentary, news, teaching, and research. Four factors determine fair use:

  • Purpose and character of use (commercial vs. educational)
  • Nature of the copyrighted work
  • Amount and substantiality of the portions used
  • Effect on market value of the original

4.5 Green Computing and Environmental Responsibility

Information technology has significant environmental impacts that organizations must address.

Table 4.3: Green Computing Strategies and Benefits

| Strategy | Implementation Approach | Environmental Benefit | Business Benefit |
| --- | --- | --- | --- |
| Virtualization | Run multiple virtual servers on single physical machines | Reduces hardware needs and energy consumption | Lower capital costs, easier maintenance |
| Cloud Migration | Move systems to efficient cloud providers | Shared infrastructure reduces overall energy use | Scalability, reduced IT overhead |
| Remote Work | Enable telecommuting policies | Reduces commuting emissions | Larger talent pool, lower office costs |
| Energy-Efficient Hardware | Choose ENERGY STAR certified devices | Lower power consumption | Reduced electricity bills |
| Extended Device Life | Repair and upgrade rather than replace | Less e-waste, fewer resources extracted | Lower replacement costs |
| Renewable Energy | Power data centers with solar/wind | Zero-emission electricity generation | Price stability, positive PR |
| E-Waste Recycling | Partner with certified recyclers | Proper disposal of hazardous materials | Regulatory compliance, community goodwill |
| Power Management | Enable sleep modes, optimize cooling | Reduced energy consumption | 20-40% lower energy costs |

Business Benefits: Beyond environmental responsibility, green computing reduces energy costs (20-40% savings), improves brand reputation, attracts environmentally conscious customers and employees, and ensures regulatory compliance.
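To see what a 20-40% energy saving means in dollars, consider a back-of-the-envelope estimate. The server load and electricity rate below are assumptions for illustration, not figures from the chapter:

```python
def annual_savings(avg_load_kw, rate_per_kwh, savings_fraction):
    """Annual cost reduction from cutting continuous IT energy use by savings_fraction."""
    annual_kwh = avg_load_kw * 24 * 365  # kW running around the clock, all year
    return annual_kwh * rate_per_kwh * savings_fraction

# Hypothetical: a 50 kW server room paying $0.12/kWh
low = annual_savings(50, 0.12, 0.20)   # 20% savings scenario
high = annual_savings(50, 0.12, 0.40)  # 40% savings scenario
print(f"${low:,.0f} to ${high:,.0f} per year")
```

With these assumed numbers the estimate comes to roughly $10,500 to $21,000 per year—enough to justify power-management projects on cost grounds alone.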

4.6 Artificial Intelligence: Ethics and Emerging Regulation

AI represents one of the most significant ethical and legal challenges facing organizations today.

Key AI Ethics Issues

Bias and Fairness: AI systems can perpetuate discrimination in hiring, lending, criminal justice, and healthcare. Amazon abandoned an AI recruiting tool that discriminated against women. In 2024, companies faced lawsuits alleging their AI hiring tools violated civil rights laws.

Transparency and Explainability: Many AI systems are “black boxes”—even creators can’t fully explain individual decisions. This creates accountability problems.

Job Displacement: AI automation threatens millions of jobs. A 2024 McKinsey report estimates AI could automate 30% of work hours by 2030.

AI Hallucinations: Large language models like ChatGPT sometimes generate false information with confidence. In 2023, lawyers faced sanctions for submitting briefs with ChatGPT-generated fake legal citations (New York Times, 2024).

Deepfakes and Synthetic Media: AI can create convincing fake videos and audio. This poses risks for fraud, political manipulation, and harassment. In 2024, AI-generated celebrity endorsement scams cost victims millions (Kreidler et al., 2024).

Emerging AI Regulations

Table 4.4: EU AI Act Risk Classification

| Risk Level | Examples | Requirements | Rationale |
| --- | --- | --- | --- |
| Unacceptable | Social scoring systems, real-time biometric surveillance in public spaces | BANNED | Violates fundamental rights |
| High Risk | AI in hiring, credit scoring, law enforcement, critical infrastructure, education | Conformity assessment, risk management, human oversight, transparency, accuracy requirements, logging | Significant potential for harm to safety or rights |
| Limited Risk | Chatbots, emotion recognition, deepfakes | Transparency obligations (must disclose AI use) | Moderate concern: awareness helps users make informed decisions |
| Minimal Risk | AI-enabled video games, spam filters, inventory management | No specific obligations | Low risk to rights and safety |

EU AI Act (2024): The world’s first comprehensive AI regulation. It uses a risk-based approach, with penalties up to €35 million or 7% of global revenue, and enforcement begins in phases through 2026 (The Act Texts, 2024).

U.S. Executive Order on AI (2023): Requires safety testing, standards for detecting AI content, addressing bias, protecting privacy, and supporting affected workers. However, executive orders can be reversed by future administrations.

State-Level AI Regulation: The Alabama Generative AI Acceptable Use Policy emphasizes ethical principles such as accountability, transparency, and fairness in AI usage. It requires human oversight to verify the accuracy, security, and appropriateness of AI-generated content, prohibiting harmful, illegal, or misleading applications. It also mandates disclosure, risk mitigation, and compliance with the NIST AI Risk Management Framework to ensure responsible and ethical integration of AI technologies (Generative AI Acceptable Use Policy, 2025).

University of North Alabama Generative AI Policy: The University of North Alabama’s Generative AI Policy promotes responsible and ethical use of AI in academic and administrative contexts. It emphasizes principles such as privacy, fairness, transparency, and accountability, requiring proper attribution of AI-generated content and adherence to academic integrity standards (University of North Alabama, Generative AI policy).

Industry Self-Regulation: Major tech companies have published AI principles emphasizing fairness, accountability, and transparency. Their effectiveness is debated, as self-regulation lacks enforcement mechanisms.

Responsible AI Practices

Organizations should:

  • Conduct AI impact assessments before deployment
  • Test for bias regularly across demographic groups
  • Maintain human oversight for consequential decisions
  • Provide transparency about AI use
  • Enable appeals of AI decisions with human review
  • Diversify AI development teams
  • Invest in workforce reskilling
  • Set clear acceptable-use policies
  • Document AI systems thoroughly
  • Stay informed about evolving regulations
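One concrete way to test for bias across demographic groups is the four-fifths (80%) rule from U.S. employee-selection guidance: the selection rate for any group should be at least 80% of the highest group’s rate. A minimal sketch—the group names and counts are hypothetical:

```python
def selection_rates(outcomes):
    """outcomes maps group -> (selected, total); returns group -> selection rate."""
    return {g: sel / tot for g, (sel, tot) in outcomes.items()}

def four_fifths_check(outcomes, threshold=0.8):
    """Flag each group True/False: does its rate reach threshold x the highest group's rate?"""
    rates = selection_rates(outcomes)
    best = max(rates.values())
    return {g: (r / best >= threshold) for g, r in rates.items()}

# Hypothetical AI hiring-tool outcomes: (candidates advanced, candidates screened)
outcomes = {"group_a": (60, 100), "group_b": (30, 100)}
print(four_fifths_check(outcomes))  # group_b's 30% rate is half of group_a's 60% -> fails
```

Passing this screen does not prove a system is fair—it is one signal among many—but failing it is a strong cue to pause deployment and investigate.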

Chapter Summary

Ethical, legal, and social considerations are fundamental to responsible information systems management. Organizations must navigate complex privacy regulations, protect intellectual property, address AI ethics, reduce environmental impact, and maintain high ethical standards. Success requires ongoing vigilance, stakeholder engagement, and commitment to doing what’s right—not just what’s legal.

Review Questions

  1. Explain three major privacy challenges organizations face today. For each, describe how recent events or technology developments have made this challenge more significant.
  2. Using the Ten Commandments of Computer Ethics (Table 4.5), analyze a recent technology controversy mentioned in the chapter (such as deepfakes, AI hiring tools, or surveillance capitalism). Which commandments apply and how?
  3. How has AI complicated intellectual property law? Discuss at least two specific challenges mentioned in the chapter, such as the New York Times lawsuit or questions about AI-generated content ownership.
  4. Using Table 4.3, identify three green computing strategies. For each, explain both the environmental and business benefits. Why should business leaders care about green computing beyond environmental concerns?

References

Alabama Code § 8-38 (2018). Data Breach Notification Act of 2018.

Alabama Code § 8-43 (2024). Alabama Genetic Data Privacy Act.

Alabama House of Representatives. (2025). HB449. Retrieved from AL Legislature.

BillTrack50. (2025). AL HB283 – Data privacy, processing of data regulated. https://www.billtrack50.com/billdetail/1832500

Cambridge Bitcoin Electricity Consumption Index. (2024). Cambridge Centre for Alternative Finance. https://ccaf.io/cbnsi/cbeci

CareerBuilder. (2024). Social media recruitment survey. https://resources.careerbuilder.com/employer-blog/70-of-employers-use-social-networking-sites-to-research-candidates-during-hiring-process  

Computer Ethics Institute. (1992). Ten Commandments of Computer Ethics. https://computerethics.institute/publications/ten-commandments-of-computer-ethics/

EEOC. (2024). EEOC launches initiative on artificial intelligence and algorithmic fairness. U.S. Equal Employment Opportunity Commission.

European Commission. (2023). Data protection in the EU. https://ec.europa.eu/info/law/law-topic/data-protection_en

European Parliament. (2024). EU AI Act: First regulation on artificial intelligence. https://www.europarl.europa.eu/topics/en/article/20230601STO93804/eu-ai-act-first-regulation-on-artificial-intelligence?tg=SEO&tag1=Entail&tag2=Entail

Federal Communications Commission. (2024). FCC restores net neutrality. https://www.fcc.gov/document/fcc-restores-net-neutrality-0

Generative AI acceptable use policy. AI-GV-P2. (2025, January 31). https://oit.alabama.gov/oit-blob-wps-01/wp-content/uploads/2025/01/Alabama-Generative-AI-Acceptable-Use-Policy.pdf

Kreidler, J., et al. (2024, September 9). Did a celebrity really endorse that? Maybe not. Consumer Advice, Federal Trade Commission. https://consumer.ftc.gov/consumer-alerts/2024/04/did-celebrity-really-endorse-maybe-not

MIT Technology Review. (2023). AI bias and fairness. https://youtu.be/NgaW_p7gsRc

Morgan, S. (2024). 2024 cybercrime report. Cybersecurity Ventures. https://cybersecurityventures.com/cybersecurity-almanac-2024/

New York Times. (2023). Clearview AI faces scrutiny over facial recognition. https://www.nytimes.com/2024/06/13/business/clearview-ai-facial-recognition-settlement.html

New York Times. (2024). Lawyers sanctioned for using ChatGPT’s fake legal citations. https://www.courthousenews.com/sanctions-ordered-for-lawyers-who-relied-on-chatgpt-artificial-intelligence-to-prepare-court-brief/

NIST. (n.d.). Computer Security Resource Center. Information Technology Laboratory Computer Security Resource Center. https://csrc.nist.gov/glossary/term/privacy  

The Act Texts. (2024). EU Artificial Intelligence Act. https://artificialintelligenceact.eu/the-act/

University of North Alabama. (2020, October 9). Privacy policy. https://www.una.edu/advancement/privacy-policy.html

U.S. Congress. (2024). Kids Online Safety Act. https://www.congress.gov/bill/118th-congress/house-bill/7891/text  

U.S. House of Representatives. (2024). Protecting Americans from Foreign Adversary Controlled Applications Act. https://www.congress.gov/bill/118th-congress/house-bill/7521

U.S. Supreme Court. (2018). Carpenter v. United States, 585 U.S., 138 S. Ct. 2206.


Credits:

Portions of this chapter were written with assistance from Claude AI (Anthropic).

Regulatory information and legal frameworks were sourced from official government and organizational publications.

Industry examples and statistics were compiled from published reports and academic research.