India’s IT Ministry has issued a stern directive to tech giants and platforms: AI cannot be allowed to interfere in elections. The Ministry of Electronics and Information Technology (MeitY) instructed companies to ensure that their AI models, algorithms, and computing resources do not exhibit bias or discrimination and do not threaten electoral integrity.

MeitY released the advisory on March 15, emphasizing labeling requirements: firms must clearly label under-tested or unreliable AI foundation models, including generative AI, as potentially fallible or unreliable before making them available to users.

Where platforms facilitate the generation of synthetic content that could be used as misinformation or deepfakes, they must embed permanent metadata or unique identifiers in such text, audio, visual, or audiovisual content.

MeitY issued the advisory after finding that intermediaries had been negligent in exercising their due diligence obligations under the IT Rules, 2021.


Experts welcomed the move. Amol Kulkarni of CUTS International said it “removes the requirement of getting government permission for using under-tested AI models” and allows “Industry to focus on research and innovation without excessive compliance worries.”

Kulkarni, however, flagged potential user experience issues, such as information overload and frequent pop-ups, and advocated a multi-pronged approach to tackling challenges in the AI/LLM ecosystem.

He also appreciated MeitY’s openness to feedback, saying the “new advisory showcases the government’s proactiveness,” and urged industry to “capitalize on this collaborative approach.”

The advisory underscores growing government scrutiny of AI’s role in misinformation and electoral interference. As AI rapidly evolves, balancing innovation with responsible deployment remains a priority.
