From: "Adam H. Kerman" <ahk@chinet.com>
Newsgroups: rec.arts.tv
Subject: Re: The Art Of Poison-Pilling Music Files
Date: Wed, 16 Apr 2025 17:00:13 -0000 (UTC)
Organization: A noiseless patient Spider
Message-ID: <vtonms$2jvm8$1@dont-email.me>
References: <t2mvvj1uuln5h9oco65ph96qhtckkpabft@4ax.com>
X-Newsreader: trn 4.0-test77 (Sep 1, 2010)

I'm not going to tag this off topic, as we regularly discuss how
allegations of copyright infringement thwart creativity.

shawn <nanoflower@notforg.m.a.i.l.com> wrote:

>This is both great and scary. The idea is that you can encode messages
>into music or other sounds that an AI can pick up but no human would
>notice. In this video the designer goes into how he introduces the
>poison pills that do things like tell Alexa or Siri to do things while
>sounding perfectly normal to us.
>
>https://www.youtube.com/watch?v=xMYm2d9bmEA
>The Art Of Poison-Pilling Music Files
>
>He has an example of what sounds like a simple song from someone
>playing a piano. Nothing at all unusual in how it sounds. Yet the AI
>picks up a series of instructions and pops up a video. As he points
>out, this could easily be an instruction to unlock all your doors.

I'm not going to panic. I'm not blaming AI itself. I'm blaming the
consumer for having bought into the "Internet of things" without
considering unintended consequences.

Even setting aside sabotage via AI, ANY connected device relies upon
regular software upgrades plus the ability to interact with the
server. The manufacturer cannot be required to provide support beyond
the warranty period, or even to stay in that line of business so that
it retains the ability to provide support. A famous example was OnStar
technology in GM vehicles, which relied upon a satellite. At some
point, either the satellite was decommissioned or GM didn't renew its
contract. Anybody could find himself facing a failure of
mission-critical technology with no way to override it manually.

AI introduces all new unfortunate circumstances, but the moment
technology is networked, the genie has already been let out of the
bottle.

As far as embedding code goes, that's been done ever since the
invention of the written word. People who want to communicate
privately have always tried to encode messages into writing that on
its face appears to be something else. Music itself has been useful
for broadcasting encoded messages that only those who know what to
listen for would notice. (A toy illustration of that old trick, in
code, follows at the end of this post.)

>In his case, as a musician, he wants to stop AI companies from using
>his and others' music without proper compensation. Hence the idea of
>introducing a "poison pill" into their music that will corrupt the AI
>databases if they use that music, and yet will sound perfectly normal
>to any human.

That's nice. What if a music historian uses AI to analyze the music he
composed, to identify each and every chord change and harmonic
progression he "borrowed" from existing music? You can find anything
in a Gregorian chant or Bach.

Fuck him. He didn't pay any royalties. By his own logic, he should be
put out of business.
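
Since the thread is about hiding signals in audio that listeners never
notice, here is a minimal sketch of the classic least-significant-bit
approach, in Python with numpy. To be clear, this is my own toy
illustration of the simplest possible scheme, not the adversarial
trick the video describes (which targets the recognition model itself
rather than a fixed decoder). It just shows how little headroom it
takes to ride a hidden message on ordinary 16-bit PCM samples.

    import numpy as np

    def embed_message(samples, message):
        # Hide a text message in the least significant bits of 16-bit
        # PCM samples. Flipping the lowest bit changes a sample by at
        # most 1 part in 32768 of full scale, far below audibility.
        bits = np.unpackbits(
            np.frombuffer(message.encode() + b"\x00", dtype=np.uint8))
        if len(bits) > len(samples):
            raise ValueError("message too long for this audio clip")
        stego = samples.copy()
        stego[:len(bits)] = (stego[:len(bits)] & ~1) | bits
        return stego

    def extract_message(samples):
        # Read the low bits back out, stopping at the zero terminator.
        data = np.packbits((samples & 1).astype(np.uint8)).tobytes()
        return data.split(b"\x00", 1)[0].decode(errors="ignore")

    # Toy usage: one second of a 440 Hz sine wave at 44.1 kHz.
    t = np.arange(44100) / 44100.0
    audio = (0.5 * np.sin(2 * np.pi * 440 * t) * 32767).astype(np.int16)
    stego = embed_message(audio, "unlock the doors")
    print(extract_message(stego))   # -> unlock the doors

A scheme this naive survives only bit-exact copying; lossy compression
or resampling destroys it, so treat it purely as an illustration of
how much room there is to hide data inside audio that sounds normal.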