Fourth Edition
“Digital security and the future of AI in Ecuador” was the central theme of the 4th edition of PrivaciQ, an event that highlights the importance of privacy as a fundamental human right.
We experienced moments full of knowledge throughout the talks, which together paint a complex and urgent picture of our national reality. The debate began by proposing a paradigm shift: understanding that AI governance is not just a set of written laws, but a technical reality rooted in the design and infrastructure of systems. For accountability to be effective, a “toolbox” is needed—one that connects the political with the material—allowing foundational models to operate under an ethic of transparency and justice, rather than remaining black boxes obscured by current regulation.
This ethical challenge becomes tangible when we observe the proliferation of unprotected models such as WormGPT or FraudGPT, which have given rise to “Scam-as-a-Service.” These systems enable cybercriminals to automate and scale phishing attacks, polymorphic malware, and financial fraud at extremely low cost, using advanced evasion techniques that put both individuals and organizations at risk.
The question that lingers is whether the AI we use daily is truly ethical, especially when we see devastating cases like the “AI 700 Case” in Quito. This event exposed how the misuse of emerging technologies can lead to digital sexual violence against minors, reminding us that data protection is not an abstract concept, but a vital defense against violations of privacy and human dignity in educational and social environments.
Regional research reinforces this warning by revealing illegal markets on platforms such as Telegram, where citizens’ data from neighboring countries is traded without control—a phenomenon Ecuador cannot ignore. This vulnerability is fueled by our own exposure: every photo or recording we upload becomes a potential attack vector that AI can exploit to clone voices or create deepfakes impersonating personal, professional, and corporate identities. Even digital advertising is entangled in this web: marketing ecosystems collect behavioral data that, when analyzed by language models, can become tools for commercial surveillance and invasive profiling, directly affecting our fundamental freedoms.
In this context, spaces like PrivaciQ are essential for building digital sovereignty that does not depend on decisions made by big tech companies. These gatherings are where resistance and critical understanding are cultivated. It is imperative that civil society stops being a passive observer of technological advancement and becomes actively involved in the debate. Only through civic participation, education in content verification, and the demand for regulatory frameworks that prioritize human rights can we move toward a future where technology empowers us rather than surveils or harms us. Ecuador’s digital sovereignty depends on our ability to bridge the technical and the social; for this reason, we invite you to be part of these spaces, to stay informed, and to defend your right to a fair and secure digital environment.