June Edition 2023

approach of the ML/copyright issue, it is important to remember that this is only a guideline, and the final legislation may take a different approach. As such, the opinion serves as an interesting starting point for a broader conversation about the intersection of AI and copyright law, rather than a final word on these matters. In addition, while the Ministry's opinion provides valuable insights into the implications of copyright law for the creation of machine learning datasets, it stops short of addressing the question of who holds the copyrights to the outputs of the NLP process. This is a significant area of concern that warrants further exploration.

The outputs of NLP raise several intellectual property questions. For instance, who owns the copyright in a text generated by an AI? Is it the developer of the AI, the user who provided the input, or, strangely enough, the AI itself (even though, as of now, AI systems are not recognized as legal entities capable of holding copyrights)? Furthermore, if an AI generates a text that infringes someone else's copyright, who is liable? These questions become even more complex when one considers that AI can generate outputs that were not explicitly programmed by its developers, making it difficult to predict and control the system's behavior and its outputs.

As we delve deeper into the realm of AI and NLP, it is also crucial to address the significant privacy concerns that accompany these technological advancements. These concerns primarily stem from the extensive data collection and processing required for AI and NLP systems to function effectively. The vast amounts of data, often encompassing personal information, raise questions about user awareness and consent. Furthermore, the potential misuse of personal information, through detailed profiling of individuals based on their online behavior, preferences, and interactions, is another area of concern.
Adding to these issues is the lack of transparency often associated with AI systems, sometimes referred to as "black boxes" due to their complex and opaque decision-making processes. This lack of transparency can make it difficult for individuals to understand how their data is being used and processed. Existing privacy laws may not be fully equipped to handle the unique challenges posed by AI and NLP, as they were drafted at a time when AI and NLP were closer to science fiction than reality. The Privacy Protection Authority (within the Ministry of Justice) recently published an opinion addressing the privacy concerns associated with "deep fake" technologies, which can create convincingly realistic but entirely