We're living in very dangerous times. There's so much disinformation, misinformation and deepfakes on the internet, especially on TikTok.
Not to mention that AI can now create very realistic videos, so you really have to look closely for those AI watermarks (like Sora's) on each and every outrageous video out there.
It's like all of these issues arose just this year.
I remember my first encounter with ChatGPT 3 years ago. I was excited and shocked at the same time when I realized that searching for something on the net (aside from Googling it) was made even easier by ChatGPT. I'm actually grateful for AI because it's very useful for checking grammar and spelling, but I've realized these days that it's become so much more than that.
Now there's a browser created by OpenAI -- the company that made ChatGPT. I've been testing it for the past few days, and aside from some minor bugs, it's a good browser.
I remember starting to surf the internet on Netscape Navigator back in 2000, during college. Then it was overrun by Microsoft's Internet Explorer. Then came Firefox from the Mozilla Foundation, which I loved greatly. Then Google made their own browser named Chrome. Microsoft fought back by replacing IE with Edge. There's also Safari from Apple and the Opera browser as well. Now there's ChatGPT Atlas, which I think will take a big share of browser users in the coming days.
Now there's the problem of deepfakes. Those videos with human "narrators" that are made entirely by AI -- which really resemble the real person. You'll find a lot of these deepfakes on TikTok -- celebrities and famous people endorsing different products.
You'll even find these "speakers" endorsing the products in Tagalog / Filipino. Good thing I'm a native Tagalog speaker, so I can spot the AI from its mispronounced Tagalog words. But these AI-generated videos are good, and I'm guessing a lot of people are being deceived by them.
I am really hoping that the Philippine Government will do something about this, just like Finland is doing.
