Ethical Considerations in AI: Navigating Bias, Fairness, and Accountability

Authors

  • Rahul Jain, Deepti Pathak, Manav Chandan

DOI:

https://doi.org/10.48047/resmil.v10i1.22

Keywords:

AI Ethics, Bias in AI, Fairness in Machine Learning, Algorithmic Accountability, Ethical AI Governance

Abstract

The integration of artificial intelligence (AI) and big-data analytics into decision-making processes has ushered in a new era of technological advancement and transformative capability across diverse sectors. However, this growing synergy has also produced a corresponding rise in ethical dilemmas and considerations. This research article explores the complex landscape of ethical issues in AI-powered decision-making, with particular emphasis on the ethical considerations associated with big-data-driven decision processes. Drawing on a comprehensive review of the existing literature, the article surveys the ethical frameworks relevant to AI and big-data ethics. It dissects specific ethical issues that arise in the context of AI decision-making, including algorithmic bias, transparency, and accountability, while also examining the intricate ethical considerations involved in the collection and use of big data, such as data privacy, security, and informed consent. To empirically investigate the scope and repercussions of these ethical quandaries, the study employs a mixed-method approach that combines qualitative and quantitative data analysis. The findings underscore the pressing need to develop and implement ethical frameworks to guide AI and big-data decision-making, and offer practical recommendations for mitigating these ethical challenges.

Published

2020-04-15

How to Cite

Rahul Jain, Deepti Pathak, Manav Chandan. (2020). Ethical Considerations in AI: Navigating Bias, Fairness, and Accountability. RES MILITARIS, 10(1), 149–155. https://doi.org/10.48047/resmil.v10i1.22
