{"id":2031,"date":"2022-02-18T15:31:18","date_gmt":"2022-02-18T22:31:18","guid":{"rendered":"https:\/\/wpsites.ucalgary.ca\/isec-601-f21\/?p=2031"},"modified":"2022-02-18T15:31:21","modified_gmt":"2022-02-18T22:31:21","slug":"as-deepfake-gets-deeper-security-risks-heighten","status":"publish","type":"post","link":"https:\/\/wpsites.ucalgary.ca\/isec-601-f21\/2022\/02\/18\/as-deepfake-gets-deeper-security-risks-heighten\/","title":{"rendered":"As Deepfake gets Deeper, Security Risks Heighten"},"content":{"rendered":"\n<p>An emerging social engineering attack combines aspects of both misinformation and cyberattacks that compromise data integrity: deepfakes.<\/p>\n\n\n\n<p>Deepfake is a term that combines the words &#8220;deep learning&#8221; and &#8220;fake,&#8221; and refers to synthetic videos, images, and audio recordings generated through deep learning AI techniques. While deepfakes have legitimate uses when created with the consent of the person depicted, in the wrong hands they can cause considerable damage.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"how-could-deepfakes-compromise-security\">How could deepfakes compromise security?<\/h3>\n\n\n\n<p>Deepfake attackers attempt to impersonate a person or persons of authority in order to spread misinformation or manipulate others into providing access to confidential data and funds.<\/p>\n\n\n\n<div class=\"wp-block-image\"><figure class=\"alignleft size-full is-resized\"><img decoding=\"async\" data-src=\"https:\/\/wpsites.ucalgary.ca\/isec-601-f21\/wp-content\/uploads\/sites\/115\/2022\/02\/960x0.jpeg\" alt=\"\" class=\"wp-image-2033 lazyload\" width=\"614\" height=\"380\" data-srcset=\"https:\/\/wpsites.ucalgary.ca\/isec-601-f21\/wp-content\/uploads\/sites\/115\/2022\/02\/960x0.jpeg 960w, https:\/\/wpsites.ucalgary.ca\/isec-601-f21\/wp-content\/uploads\/sites\/115\/2022\/02\/960x0-300x186.jpeg 300w, 
https:\/\/wpsites.ucalgary.ca\/isec-601-f21\/wp-content\/uploads\/sites\/115\/2022\/02\/960x0-768x477.jpeg 768w\" data-sizes=\"(max-width: 614px) 100vw, 614px\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" style=\"--smush-placeholder-width: 614px; --smush-placeholder-aspect-ratio: 614\/380;\" \/><figcaption><br><em>FBI warning<\/em> https:\/\/www.forbes.com\/sites\/glenngow\/2021\/05\/02\/the-scary-truth-behind-the-fbi-warning-deepfake-fraud-is-here-and-its-serious-we-are-not-prepared\/?sh=5834dbeb3179<\/figcaption><\/figure><\/div>\n\n\n\n<p>In March 2021, the FBI released a warning about the rising threat of synthetic content. The FBI warns that attackers use deepfake technology to create highly realistic spearphishing messages. Attackers are expected to supplement voice spearphishing attacks with audio deepfakes aimed at persuading specific individuals to share or allow access to personal or corporate information. Additionally, the FBI warned about Business Identity Compromise (BIC), a new cyberattack vector that evolved from Business Email Compromise (BEC). 
BIC uses audio deepfakes to create &#8220;synthetic corporate personas&#8221; or to impersonate existing employees in order to elicit fraudulent funds transfers.<\/p>\n\n\n\n<div class=\"wp-block-image\"><figure class=\"alignleft size-large is-resized\"><img decoding=\"async\" data-src=\"https:\/\/wpsites.ucalgary.ca\/isec-601-f21\/wp-content\/uploads\/sites\/115\/2022\/02\/03Ziw34njaGkafJTHFYQwdY-1.fit_scale.size_1028x578.v1645027028-1024x576.jpg\" alt=\"\" class=\"wp-image-2034 lazyload\" width=\"864\" height=\"485\" data-srcset=\"https:\/\/wpsites.ucalgary.ca\/isec-601-f21\/wp-content\/uploads\/sites\/115\/2022\/02\/03Ziw34njaGkafJTHFYQwdY-1.fit_scale.size_1028x578.v1645027028-1024x576.jpg 1024w, https:\/\/wpsites.ucalgary.ca\/isec-601-f21\/wp-content\/uploads\/sites\/115\/2022\/02\/03Ziw34njaGkafJTHFYQwdY-1.fit_scale.size_1028x578.v1645027028-300x169.jpg 300w, https:\/\/wpsites.ucalgary.ca\/isec-601-f21\/wp-content\/uploads\/sites\/115\/2022\/02\/03Ziw34njaGkafJTHFYQwdY-1.fit_scale.size_1028x578.v1645027028-768x432.jpg 768w, https:\/\/wpsites.ucalgary.ca\/isec-601-f21\/wp-content\/uploads\/sites\/115\/2022\/02\/03Ziw34njaGkafJTHFYQwdY-1.fit_scale.size_1028x578.v1645027028.jpg 1027w\" data-sizes=\"(max-width: 864px) 100vw, 864px\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" style=\"--smush-placeholder-width: 864px; --smush-placeholder-aspect-ratio: 864\/485;\" \/><figcaption>https:\/\/www.pcmag.com\/news\/fbi-dont-fall-for-this-money-transfer-video-chat-scam?amp=true<\/figcaption><\/figure><\/div>\n\n\n\n<p>More recently, the FBI issued another warning about an increase in fraudsters exploiting virtual meeting platforms. Some schemes involve hijacking company executives&#8217; video meeting accounts and using deepfake technology to impersonate them in video meetings. 
The rise of video conferencing during the pandemic has given cybercriminals a new avenue to trick employees into wiring company funds.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"deepfakes-as-a-threat-to-organizations\">Deepfakes as a threat to organizations<\/h3>\n\n\n\n<p>In 2019, the CEO of a UK-based energy firm was defrauded into transferring US$243,000 to a Hungarian supplier&#8217;s bank account by a voice deepfake impersonating his parent company&#8217;s chief executive. It is believed that the threat actors used commercial voice-generating software to carry out the attack. This was the first known example of a deepfake being used in a scam.<\/p>\n\n\n\n<p>In 2021, a bank manager received a phone call from what sounded like one of the bank&#8217;s corporate directors, asking for a $35 million transfer to fund an acquisition. In reality, the caller was not the director; it was a deepfake of the director&#8217;s voice. By the time the bank became aware of the fraud, the funds had already been lost.<\/p>\n\n\n\n<p>Deepfake technology is becoming more accessible and easier to use, posing greater risks to organizations. Millions of dollars have already been stolen with audio deepfakes, and the technology is expected to grow more sophisticated. Moreover, deepfakes are not limited to spearphishing attacks or BIC. There have already been video deepfakes that bypass facial recognition technology, and they may soon be able to bypass voice recognition technology as well. With technology that can fool authentication factors, such as biometrics, the risk of security compromise becomes much greater. 
Organizations should update their security protocols as the potential risk grows.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"how-to-protect-against-deepfake-attacks\">How to Protect Against Deepfake Attacks<\/h3>\n\n\n\n<h4 class=\"wp-block-heading\" id=\"employee-training\">Employee Training<\/h4>\n\n\n\n<p>Strengthen your first line of defence against deepfakes by training staff to spot them.<\/p>\n\n\n\n<h4 class=\"wp-block-heading\" id=\"trust-but-verify\">Trust but Verify<\/h4>\n\n\n\n<p>To detect an attack before it can cause any harm, implement protocols that specify verification procedures for suspicious communications, such as confirming fund-transfer requests through a separate, trusted channel.<\/p>\n\n\n\n<h4 class=\"wp-block-heading\" id=\"automated-detection\">Automated Detection<\/h4>\n\n\n\n<p>Automated detection can also be achieved with the same deep learning algorithms that are used to create deepfakes.<\/p>\n\n\n\n<h4 class=\"wp-block-heading\" id=\"response-strategy\">Response Strategy<\/h4>\n\n\n\n<p>Incident response plans should cover deepfakes, and stakeholders need to know how to respond when an attack occurs.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"references\">References<\/h3>\n\n\n\n<p><a href=\"https:\/\/www.entrepreneur.com\/article\/414109\">https:\/\/www.entrepreneur.com\/article\/414109<\/a><\/p>\n\n\n\n<p><a href=\"https:\/\/www.forbes.com\/sites\/glenngow\/2021\/05\/02\/the-scary-truth-behind-the-fbi-warning-deepfake-fraud-is-here-and-its-serious-we-are-not-prepared\/?sh=5834dbeb3179\">https:\/\/www.forbes.com\/sites\/glenngow\/2021\/05\/02\/the-scary-truth-behind-the-fbi-warning-deepfake-fraud-is-here-and-its-serious-we-are-not-prepared\/?sh=5834dbeb3179<\/a><\/p>\n\n\n\n<p><a href=\"https:\/\/www.pcmag.com\/news\/fbi-dont-fall-for-this-money-transfer-video-chat-scam?amp=true\">https:\/\/www.pcmag.com\/news\/fbi-dont-fall-for-this-money-transfer-video-chat-scam?amp=true<\/a><\/p>\n\n\n\n<p><a 
href=\"https:\/\/www.accdocket.com\/deepfakes-get-deeper-security-risks-heighten\">https:\/\/www.accdocket.com\/deepfakes-get-deeper-security-risks-heighten<\/a><\/p>\n\n\n\n<p><a href=\"https:\/\/www.pandasecurity.com\/en\/mediacenter\/technology\/deepfake-fraud\/#:~:text=Deepfakes%20are%20videos%2C%20images%20or,been%20manipulated%20by%20AI%20technology.&amp;text=This%20has%20become%20a%20growing,of%20misinformation%20and%20fraud%20scams\">https:\/\/www.pandasecurity.com\/en\/mediacenter\/technology\/deepfake-fraud\/#:~:text=Deepfakes%20are%20videos%2C%20images%20or,been%20manipulated%20by%20AI%20technology.&amp;text=This%20has%20become%20a%20growing,of%20misinformation%20and%20fraud%20scams<\/a><\/p>\n\n\n\n<p><a href=\"https:\/\/builtin.com\/cybersecurity\/deepfake-phishing-attacks\">https:\/\/builtin.com\/cybersecurity\/deepfake-phishing-attacks<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>An emerging social engineering attack combines aspects of both misinformation and cyberattacks compromising data integrity: deepfakes. Deepfake is a term that combines the words &#8220;deep learning&#8221; and &#8220;fakes,&#8221; which refers to synthetic videos, images, and audio recordings generated through deep learning AI techniques. 
While there is a positive side to the deepfake when accompanied with &hellip; <\/p>\n<p class=\"link-more\"><a href=\"https:\/\/wpsites.ucalgary.ca\/isec-601-f21\/2022\/02\/18\/as-deepfake-gets-deeper-security-risks-heighten\/\" class=\"more-link\">Continue reading<span class=\"screen-reader-text\"> &#8220;As Deepfake gets Deeper, Security Risks Heighten&#8221;<\/span><\/a><\/p>\n","protected":false},"author":390,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"ngg_post_thumbnail":0,"footnotes":""},"categories":[15],"tags":[],"class_list":["post-2031","post","type-post","status-publish","format-standard","hentry","category-cpsc-329-602-w22","entry"],"featured_image_src":null,"featured_image_src_square":null,"author_info":{"display_name":"Seyeon Sim","author_link":"https:\/\/wpsites.ucalgary.ca\/isec-601-f21\/author\/seyeon-sim\/"},"_links":{"self":[{"href":"https:\/\/wpsites.ucalgary.ca\/isec-601-f21\/wp-json\/wp\/v2\/posts\/2031","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/wpsites.ucalgary.ca\/isec-601-f21\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/wpsites.ucalgary.ca\/isec-601-f21\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/wpsites.ucalgary.ca\/isec-601-f21\/wp-json\/wp\/v2\/users\/390"}],"replies":[{"embeddable":true,"href":"https:\/\/wpsites.ucalgary.ca\/isec-601-f21\/wp-json\/wp\/v2\/comments?post=2031"}],"version-history":[{"count":9,"href":"https:\/\/wpsites.ucalgary.ca\/isec-601-f21\/wp-json\/wp\/v2\/posts\/2031\/revisions"}],"predecessor-version":[{"id":2044,"href":"https:\/\/wpsites.ucalgary.ca\/isec-601-f21\/wp-json\/wp\/v2\/posts\/2031\/revisions\/2044"}],"wp:attachment":[{"href":"https:\/\/wpsites.ucalgary.ca\/isec-601-f21\/wp-json\/wp\/v2\/media?parent=2031"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/wpsites.ucalgary.ca\/isec-601-f21\/wp-json\/wp\/v2\/categories?post=2031"},{"taxono
my":"post_tag","embeddable":true,"href":"https:\/\/wpsites.ucalgary.ca\/isec-601-f21\/wp-json\/wp\/v2\/tags?post=2031"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}