{"id":6149,"date":"2024-11-12T19:18:28","date_gmt":"2024-11-12T19:18:28","guid":{"rendered":"https:\/\/hectorkott.com\/?p=6149"},"modified":"2024-11-12T19:18:28","modified_gmt":"2024-11-12T19:18:28","slug":"musks-influence-on-trump-could-lead-to-tougher-ai-standards-says-scientist","status":"publish","type":"post","link":"https:\/\/hectorkott.com\/?p=6149","title":{"rendered":"Musk\u2019s influence on Trump could lead to tougher AI standards, says scientist"},"content":{"rendered":"<div>\n<div>\n<div>\n<div>\n<div>\n<p>Elon Musk\u2019s influence on a Donald Trump administration could lead to tougher safety standards for artificial intelligence, according to a <a href=\"https:\/\/www.theguardian.com\/technology\/2023\/sep\/21\/ai-focused-tech-firms-locked-race-bottom-warns-mit-professor-max-tegmark\">leading scientist<\/a> who has worked closely with the world\u2019s richest person on addressing AI\u2019s dangers.<\/p>\n<p>Max Tegmark said Musk\u2019s support for a failed AI bill in California underlined the billionaire\u2019s continued concern over an issue that did not feature prominently in Trump\u2019s campaign.<\/p>\n<p>However, Musk has warned regularly that unrestrained development of AI \u2013 broadly, computer systems performing tasks that typically require human intelligence \u2013 could be catastrophic for humanity. 
Last year, he was one of <a href=\"https:\/\/www.theguardian.com\/technology\/2023\/mar\/29\/elon-musk-joins-call-for-pause-in-creation-of-giant-ai-digital-minds\">more than 30,000 signatories<\/a> to a letter calling for a pause in work on powerful AI technology.<\/p>\n<p>Speaking to the Guardian at the Web Summit in Lisbon, Tegmark said Musk, who is expected to be <a href=\"https:\/\/www.theguardian.com\/technology\/2024\/nov\/06\/how-elon-musk-stands-to-benefit-from-trump-presidency\">heavily influential in the president-elect\u2019s administration<\/a>, could persuade Trump to introduce standards that prevent the development of artificial general intelligence (AGI), the term for AI systems that match or exceed human levels of intelligence.<\/p>\n<p>\u201cI do think that if Elon manages to get Trump\u2019s ear on AI issues we\u2019re more likely to get some form of safety standards, something that prevents AGI,\u201d he said.<\/p>\n<p>Tegmark, a professor specialising in AI at the Massachusetts Institute of Technology, added: \u201cHe might help Trump understand that an AGI race is a suicide race.\u201d<\/p>\n<p>Tegmark said Musk\u2019s support for the SB 1047 bill in California, in the face of opposition from many of his tech peers, was a positive sign for AI safety campaigners. The bill, which required companies to stress-test large AI models before releasing them, was vetoed by the California governor, Gavin Newsom, after he said it could <a href=\"https:\/\/www.theguardian.com\/us-news\/2024\/sep\/29\/california-governor-gavin-newsom-vetoes-ai-safety-bill\">drive AI businesses from the state and hinder innovation<\/a>.<\/p>\n<p>\u201cElon Musk came out and said I\u2019m for it, I want the regulation. 
I do think it\u2019s not completely implausible he could persuade Trump that AI needs to be controlled,\u201d Tegmark said.<\/p>\n<p>Musk was an early supporter and <a href=\"https:\/\/futureoflife.org\/fli-projects\/elon-musk-donates-10m-to-our-research-program\/#:~:text=Contents,more%20about%20the%20pledge%20here\">financial backer<\/a> of Tegmark\u2019s Future of Life Institute, which campaigns for safer use of cutting-edge technology. The Tesla chief executive and owner of X\u2019s <a href=\"https:\/\/www.theguardian.com\/business\/2024\/nov\/07\/trump-victory-adds-record-wealth-richest-top-10\">personal fortune has swelled significantly<\/a> since Trump\u2019s victory last week.<\/p>\n<p>Musk launched his own AI startup last year and said the world needed to <a href=\"https:\/\/www.theguardian.com\/technology\/2023\/jul\/13\/elon-musk-launches-xai-startup-pro-humanity-terminator-future\">worry about a \u201cTerminator future\u201d<\/a> in order to head off the worst-case scenario of AI systems evading human control. Other AI professionals have argued that focusing on apocalyptic concerns distracts from short-term problems with AI systems, such as <a href=\"https:\/\/www.theguardian.com\/technology\/2023\/oct\/29\/ai-doomsday-warnings-a-distraction-from-the-danger-it-already-poses-warns-expert\">manipulated and misleading content<\/a>.<\/p>\n<p>Trump has vowed to repeal a Biden <a href=\"https:\/\/www.theguardian.com\/technology\/2023\/oct\/30\/biden-orders-tech-firms-to-share-ai-safety-test-results-with-us-government\">administration executive order on AI safety<\/a>; the Republican party\u2019s election platform described it as a set of restrictions that \u201cimposes radical leftwing ideas on the development of this technology\u201d.<\/p>\n<p>The order includes requiring companies developing high-risk systems \u2013 AI models that pose a threat to national security, economic security or health and safety \u2013 to share their safety test results with the government.<\/p>\n<\/div>\n<\/div>\n<\/div>\n<\/div>\n<\/div>\n<p>\nSource: https:\/\/ift.tt\/9HdApmF<br \/>\nPublished: November 12, 2024 at 03:59AM<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Elon Musk\u2019s influence on a Donald Trump administration could lead to tougher safety standards for artificial intelligence, according to a leading 
scientist who has worked closely with the world\u2019s richest person on addressing AI\u2019s dangers. Max Tegmark said Musk\u2019s support for a failed AI bill in California underlined the billionaire\u2019s continued concern over an issue&#8230;<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_bbp_topic_count":0,"_bbp_reply_count":0,"_bbp_total_topic_count":0,"_bbp_total_reply_count":0,"_bbp_voice_count":0,"_bbp_anonymous_reply_count":0,"_bbp_topic_count_hidden":0,"_bbp_reply_count_hidden":0,"_bbp_forum_subforum_count":0,"footnotes":""},"categories":[91],"tags":[75,76,77],"class_list":["post-6149","post","type-post","status-publish","format-standard","hentry","category-news","tag-news","tag-noticias","tag-viral"],"_links":{"self":[{"href":"https:\/\/hectorkott.com\/index.php?rest_route=\/wp\/v2\/posts\/6149","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/hectorkott.com\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/hectorkott.com\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/hectorkott.com\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/hectorkott.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=6149"}],"version-history":[{"count":1,"href":"https:\/\/hectorkott.com\/index.php?rest_route=\/wp\/v2\/posts\/6149\/revisions"}],"predecessor-version":[{"id":6150,"href":"https:\/\/hectorkott.com\/index.php?rest_route=\/wp\/v2\/posts\/6149\/revisions\/6150"}],"wp:attachment":[{"href":"https:\/\/hectorkott.com\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=6149"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/hectorkott.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=6149"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/hectorkott.com\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=6149"}
],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}