{"id":8505,"date":"2020-06-09T09:05:06","date_gmt":"2020-06-09T16:05:06","guid":{"rendered":"https:\/\/www.hmc.edu\/about-hmc\/?p=8505"},"modified":"2020-06-10T11:49:26","modified_gmt":"2020-06-10T18:49:26","slug":"nsf-grant-supports-effort-to-identify-deep-fake-technology","status":"publish","type":"post","link":"https:\/\/www.hmc.edu\/about\/2020\/06\/09\/nsf-grant-supports-effort-to-identify-deep-fake-technology\/","title":{"rendered":"NSF Grant Supports Effort to Identify Deep Fake Technology"},"content":{"rendered":"<p>When scrolling through the news in your favorite app, you probably have a method of determining whether the content you\u2019re seeing is legitimate or fake. What\u2019s the source of the story? Who benefits from the story being shared? Are there obvious factual errors? Sometimes, it\u2019s not even necessary to read past the headline because you know, for example, that there\u2019s enough evidence to prove the Earth is, indeed, round.<\/p>\n<p>But how can you tell if the video you\u2019re watching or the audio you\u2019re hearing is illegitimate when it looks and sounds like the real thing?<\/p>\n<p>Engineering professor TJ Tsai has been awarded a National Science Foundation (NSF) grant for his project, \u201cA Cross-Verification Approach for Identifying Tampered Audio,\u201d which seeks ways to identify fake or tampered audio, such as synthetic recordings generated by deep fake technology.<\/p>\n<p>Though it\u2019s relatively new, deep fake technology is prevalent, and \u201cthere&#8217;s definitely concern that bad actors could use this technology in a very destructive way,\u201d says Tsai.<\/p>\n<p>Indeed, one system can modify videos in a photorealistic manner to lip-sync to unrelated audio recordings. Another can allow a source actor to control the facial expressions and head movements of a person in a target video. 
Recent advances in speech synthesis have enabled systems to learn and imitate the characteristics of a person\u2019s voice with very limited training data.<\/p>\n<p>Tsai says, \u201cRather than approaching this problem from a computer science-centric perspective: \u2018Is this video a deep fake?\u2019, this proposal approaches the problem from a history-centric perspective: \u2018Is this video a historically verifiable event?\u2019 Historians have a very robust methodology for answering this question, and we can apply this methodology to audio and video data. In the same way that a historian tests a historical claim by cross-checking against all other primary sources of information, we can test the authenticity of an audiovisual recording by cross-checking against all other primary sources of audiovisual information.\u201d<\/p>\n<p>The idea for the project came to Tsai as he was listening to a sermon at church.\u00a0\u201cThe speaker was talking about how we can know whether a historical event actually happened,\u201d Tsai says. \u201cHistorians have well-established tools for determining historical truth, and one of the primary mechanisms is cross-verifying a historical claim against other primary sources. If something is true, it will be internally consistent. If it is false, it will contradict other eyewitness accounts. At the time, I had been thinking about the problem of detecting fake or tampered videos, and it occurred to me that all of the work I had seen was focused on scrutinizing the video itself to determine if it was genuine or not. 
I thought it would be interesting to approach the problem from the perspective of verifying a historical claim by cross-checking against other recordings of the same event.\u201d<\/p>\n<p>With the help of six student researchers over two summers, Tsai will develop methods to cross-verify audio data in two different scenarios: cross-verifying with trusted data and cross-verifying with untrusted data.<\/p>\n<p>\u201cThe proposal develops tools to counteract the spread of false audiovisual information,\u201d Tsai says. \u201cIn particular, it focuses on protecting world leaders from fake videos that might cause instability or unrest at a national level. These same tools establish the reliability of true audiovisual information. Beyond simply detecting fake videos, it provides a quantitative way to measure the reliability of audiovisual data concerning public matters.\u201d<\/p>\n<p>NSF grants are the largest share of external support for faculty research at Harvey Mudd College.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>When scrolling through the news in your favorite app, you probably have a method of determining whether the content you\u2019re 
[&hellip;]<\/p>\n","protected":false},"author":145,"featured_media":4819,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[12,217],"class_list":["post-8505","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-engineering","category-unlisted"],"acf":[],"_links":{"self":[{"href":"https:\/\/www.hmc.edu\/about\/wp-json\/wp\/v2\/posts\/8505","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.hmc.edu\/about\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.hmc.edu\/about\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.hmc.edu\/about\/wp-json\/wp\/v2\/users\/145"}],"replies":[{"embeddable":true,"href":"https:\/\/www.hmc.edu\/about\/wp-json\/wp\/v2\/comments?post=8505"}],"version-history":[{"count":0,"href":"https:\/\/www.hmc.edu\/about\/wp-json\/wp\/v2\/posts\/8505\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.hmc.edu\/about\/wp-json\/wp\/v2\/media\/4819"}],"wp:attachment":[{"href":"https:\/\/www.hmc.edu\/about\/wp-json\/wp\/v2\/media?parent=8505"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.hmc.edu\/about\/wp-json\/wp\/v2\/categories?post=8505"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}