{"id":5501,"date":"2019-04-09T16:23:44","date_gmt":"2019-04-09T15:23:44","guid":{"rendered":"http:\/\/qpol.qub.ac.uk\/?p=5501"},"modified":"2019-04-09T16:23:44","modified_gmt":"2019-04-09T15:23:44","slug":"why-artificial-intelligence-needs-democratic-governance","status":"publish","type":"post","link":"https:\/\/blogs.qub.ac.uk\/qpol\/why-artificial-intelligence-needs-democratic-governance\/","title":{"rendered":"Why Artificial Intelligence Needs Democratic Governance"},"content":{"rendered":"\n<p>Popular&nbsp;<a href=\"https:\/\/www.theverge.com\/2018\/7\/7\/17538112\/the-robots-of-gotham-todd-mcaulty-hero-uprising-science-fiction-book-review\" target=\"_blank\" rel=\"noopener noreferrer\" data-saferedirecturl=\"https:\/\/www.google.com\/url?q=https:\/\/www.theverge.com\/2018\/7\/7\/17538112\/the-robots-of-gotham-todd-mcaulty-hero-uprising-science-fiction-book-review&amp;source=gmail&amp;ust=1554761874481000&amp;usg=AFQjCNHsd1rXbU0SrVdfx5UXcJ1uR-naVg\">depictions of artificial intelligence-based systems<\/a>&nbsp;present a frightening and opaque Orwellian power that appears to dominate the lives of human beings \u2013 think Terminator or&nbsp;<a href=\"https:\/\/www.nytimes.com\/2018\/03\/30\/movies\/hal-2001-a-space-odyssey-voice-douglas-rain.html\" target=\"_blank\" rel=\"noopener noreferrer\" data-saferedirecturl=\"https:\/\/www.google.com\/url?q=https:\/\/www.nytimes.com\/2018\/03\/30\/movies\/hal-2001-a-space-odyssey-voice-douglas-rain.html&amp;source=gmail&amp;ust=1554761874481000&amp;usg=AFQjCNF4Y85e8e1-KwlAsitt5D6DoNJ8sw\">Hal 9000<\/a>. In my view, the idea that AI will take over the world is misplaced. 
However, I do believe AI poses serious threats to democratic politics, democratic institutions, and our capacity and right to engage freely in democratic practices.<\/p>\n\n\n\n<p>We must urgently address these threats by strengthening democratic control over those who design and develop AI, profit from it, and can use it to the detriment of democratic politics.<\/p>\n\n\n\n<p><strong><br>\nArtificial Intelligence and Bias<\/strong><\/p>\n\n\n\n<p>Recent concerns over AI have focused on its potential to&nbsp;<a href=\"https:\/\/theglobepost.com\/2018\/10\/05\/ai-racial-bias\/\" target=\"_blank\" rel=\"noopener noreferrer\" data-saferedirecturl=\"https:\/\/www.google.com\/url?q=https:\/\/theglobepost.com\/2018\/10\/05\/ai-racial-bias\/&amp;source=gmail&amp;ust=1554761874481000&amp;usg=AFQjCNET4s_Py29mWz9KCRyIfihkkBtaVg\">compound existing racial and gender-based inequalities<\/a>. For example,&nbsp;<a href=\"https:\/\/www.media.mit.edu\/people\/joyab\/overview\/\" target=\"_blank\" rel=\"noopener noreferrer\" data-saferedirecturl=\"https:\/\/www.google.com\/url?q=https:\/\/www.media.mit.edu\/people\/joyab\/overview\/&amp;source=gmail&amp;ust=1554761874481000&amp;usg=AFQjCNEBE2wkx0WV5_plWt3NenJmTcwp-g\">research by<\/a>&nbsp;MIT scholar&nbsp;<strong>Joy Buolamwini<\/strong>&nbsp;and UCLA Professor&nbsp;<strong>Safiya Noble<\/strong>&nbsp;has demonstrated how algorithms trained on racially biased data sets&nbsp;<a href=\"https:\/\/www.ajlunited.org\/\" target=\"_blank\" rel=\"noopener noreferrer\" data-saferedirecturl=\"https:\/\/www.google.com\/url?q=https:\/\/www.ajlunited.org\/&amp;source=gmail&amp;ust=1554761874481000&amp;usg=AFQjCNG89CjaSwXsQOuGaFji8lAtSaWfCA\">discriminate against people of color<\/a>, especially against women of color.<\/p>\n\n\n\n<p>The standard response to this concern is a&nbsp;<a href=\"https:\/\/www.theverge.com\/2019\/1\/25\/18197137\/amazon-rekognition-facial-recognition-bias-race-gender\" target=\"_blank\" rel=\"noopener noreferrer\" 
data-saferedirecturl=\"https:\/\/www.google.com\/url?q=https:\/\/www.theverge.com\/2019\/1\/25\/18197137\/amazon-rekognition-facial-recognition-bias-race-gender&amp;source=gmail&amp;ust=1554761874482000&amp;usg=AFQjCNEYQfTxU355uPU1DZeLXPVbvX8Dfg\">call for diversity<\/a>: we should use a diverse range of training data for algorithms, for example by including more dark-skinned and female faces when developing training data for face recognition technologies. We should also diversify the pool of AI developers beyond the predominantly pale and male staff.<\/p>\n\n\n\n<p><strong><br>\nAI\u2019s Structural Threat to Democracies<\/strong><\/p>\n\n\n\n<p>The call to&nbsp;<a href=\"https:\/\/www.technologyreview.com\/s\/610637\/for-better-ai-diversify-the-people-building-it\/\" target=\"_blank\" rel=\"noopener noreferrer\" data-saferedirecturl=\"https:\/\/www.google.com\/url?q=https:\/\/www.technologyreview.com\/s\/610637\/for-better-ai-diversify-the-people-building-it\/&amp;source=gmail&amp;ust=1554761874482000&amp;usg=AFQjCNHPap6EVg9Iumi7r4fnlvO1q88ifQ\">diversify the AI workforce<\/a>&nbsp;has become shorthand for democratizing AI. Such focus is undoubtedly important and welcome: a diverse workforce brings a range of life experiences and perspectives to the workplace and workplace practices. But diversity does not tackle AI\u2019s structural&nbsp;<a href=\"https:\/\/www.theatlantic.com\/technology\/archive\/2017\/10\/what-facebook-did\/542502\/\" target=\"_blank\" rel=\"noopener noreferrer\" data-saferedirecturl=\"https:\/\/www.google.com\/url?q=https:\/\/www.theatlantic.com\/technology\/archive\/2017\/10\/what-facebook-did\/542502\/&amp;source=gmail&amp;ust=1554761874482000&amp;usg=AFQjCNGxPBik7Mny5iOz1VlpvkKsXJaVzA\">threat to our democracies<\/a>. 
This threat runs deeper than the \u201cadd color, gender, and stir\u201d approach.<\/p>\n\n\n\n<p><strong>Paul Nemitz<\/strong>, Principal Advisor in the European Commission,&nbsp;<a href=\"http:\/\/rsta.royalsocietypublishing.org\/content\/376\/2133\/20180089\" target=\"_blank\" rel=\"noopener noreferrer\" data-saferedirecturl=\"https:\/\/www.google.com\/url?q=http:\/\/rsta.royalsocietypublishing.org\/content\/376\/2133\/20180089&amp;source=gmail&amp;ust=1554761874482000&amp;usg=AFQjCNG_p5VwOTkzCevFt7UXCn8bXRVjvg\">argues that<\/a>&nbsp;AI technologies have the potential to distort the infrastructure of democratic deliberation and its context.<\/p>\n\n\n\n<p>AI provides the tools that enable direct interference with democratic processes, for example by facilitating practices of misinformation and by&nbsp;<a href=\"https:\/\/www.nytimes.com\/2012\/07\/24\/business\/media\/survey-shows-voters-are-wary-of-tailored-political-ads.html\" target=\"_blank\" rel=\"noopener noreferrer\" data-saferedirecturl=\"https:\/\/www.google.com\/url?q=https:\/\/www.nytimes.com\/2012\/07\/24\/business\/media\/survey-shows-voters-are-wary-of-tailored-political-ads.html&amp;source=gmail&amp;ust=1554761874482000&amp;usg=AFQjCNGtNriBz-FUy5owAl0Nlwd8sETzig\">microtargeting prospective voters<\/a>&nbsp;through individually tailored political advertising. Such microtargeting exploits individual fears and vulnerabilities for political gain. 
It undermines the shared information basis of political communities, and destroys what philosopher&nbsp;<strong>Hannah Arendt<\/strong>&nbsp;called our \u201c<a href=\"https:\/\/link.springer.com\/article\/10.1007\/s11217-018-9618-3\" target=\"_blank\" rel=\"noopener noreferrer\" data-saferedirecturl=\"https:\/\/www.google.com\/url?q=https:\/\/link.springer.com\/article\/10.1007\/s11217-018-9618-3&amp;source=gmail&amp;ust=1554761874482000&amp;usg=AFQjCNGZawfeIuu2N3uounzrXqR-4iycUg\">common world<\/a>.\u201d<\/p>\n\n\n\n<p><strong><br>\nRegulating Tech Companies<\/strong><\/p>\n\n\n\n<p>The actions of&nbsp;<a href=\"https:\/\/theglobepost.com\/2019\/03\/12\/facebook-anniversary-user-data\/\" target=\"_blank\" rel=\"noopener noreferrer\" data-saferedirecturl=\"https:\/\/www.google.com\/url?q=https:\/\/theglobepost.com\/2019\/03\/12\/facebook-anniversary-user-data\/&amp;source=gmail&amp;ust=1554761874482000&amp;usg=AFQjCNHZqnLgqC78NHJRihK6sqsvWtxAlQ\">Facebook<\/a>&nbsp;and&nbsp;<a href=\"https:\/\/theglobepost.com\/2018\/06\/05\/cambridge-analyticas-privacy\/\" target=\"_blank\" rel=\"noopener noreferrer\" data-saferedirecturl=\"https:\/\/www.google.com\/url?q=https:\/\/theglobepost.com\/2018\/06\/05\/cambridge-analyticas-privacy\/&amp;source=gmail&amp;ust=1554761874482000&amp;usg=AFQjCNHp3T5MC9QATywcT4sQaUUBFRLj9g\">Cambridge Analytica<\/a>&nbsp;in the 2016&nbsp;<a href=\"https:\/\/theglobepost.com\/2019\/03\/29\/brexit-day-uk\/\" target=\"_blank\" rel=\"noopener noreferrer\" data-saferedirecturl=\"https:\/\/www.google.com\/url?q=https:\/\/theglobepost.com\/2019\/03\/29\/brexit-day-uk\/&amp;source=gmail&amp;ust=1554761874482000&amp;usg=AFQjCNFo9_TkGZzZkeEFQD50gdtQ_hAXJQ\">Brexit<\/a>&nbsp;referendum and the&nbsp;<a href=\"http:\/\/nymag.com\/intelligencer\/2018\/03\/facebook-haunted-by-its-handling-of-2016-election-meddling.html\" target=\"_blank\" rel=\"noopener noreferrer\" 
data-saferedirecturl=\"https:\/\/www.google.com\/url?q=http:\/\/nymag.com\/intelligencer\/2018\/03\/facebook-haunted-by-its-handling-of-2016-election-meddling.html&amp;source=gmail&amp;ust=1554761874482000&amp;usg=AFQjCNG_2_CvpHBvMf4HQgjhKXaFDs83FA\">U.S. presidential election<\/a>&nbsp;of the same year have become textbook examples of digitally mediated interferences in the democratic process. Worryingly,&nbsp;<a href=\"https:\/\/www.theguardian.com\/politics\/2019\/apr\/04\/inquiry-launched-into-data-use-from-no-deal-brexit-ads-on-facebook\" target=\"_blank\" rel=\"noopener noreferrer\">recent evidence<\/a>&nbsp;suggests that such practices continue.<\/p>\n\n\n\n<p>These interferences benefit from the almost unfettered power of&nbsp;<a href=\"https:\/\/theglobepost.com\/2019\/03\/28\/huawei-us-threat\/\" target=\"_blank\" rel=\"noopener noreferrer\" data-saferedirecturl=\"https:\/\/www.google.com\/url?q=https:\/\/theglobepost.com\/2019\/03\/28\/huawei-us-threat\/&amp;source=gmail&amp;ust=1554761874482000&amp;usg=AFQjCNGuxhGApQLWeu1pQIANmyoAaF-kCw\">big tech companies<\/a>, their respective models of&nbsp;<a href=\"https:\/\/www.theguardian.com\/technology\/2019\/jan\/20\/shoshana-zuboff-age-of-surveillance-capitalism-google-facebook\" target=\"_blank\" rel=\"noopener noreferrer\" data-saferedirecturl=\"https:\/\/www.google.com\/url?q=https:\/\/www.theguardian.com\/technology\/2019\/jan\/20\/shoshana-zuboff-age-of-surveillance-capitalism-google-facebook&amp;source=gmail&amp;ust=1554761874482000&amp;usg=AFQjCNElzaaQfkOp-_lBtOYANcTT997J9g\">surveillance capitalism<\/a>,&nbsp;and their capacity to undermine democratic processes, practices, and institutions.<\/p>\n\n\n\n<p>Regulating the tech giants and curtailing their influence on democratic politics is an urgent task. 
This task requires effective oversight over AI companies, conducted by AI-literate, democratic institutions, but also by other participants in the democratic process, including journalists, NGOs, academics, and the broader citizenry.<\/p>\n\n\n\n<p>&nbsp;<\/p>\n\n\n\n<p><strong>AI Concerns<\/strong><\/p>\n\n\n\n<p>The effective regulation of tech giants must also tackle&nbsp;<a href=\"https:\/\/theglobepost.com\/2019\/03\/19\/consequences-of-inequality\/\" target=\"_blank\" rel=\"noopener noreferrer\" data-saferedirecturl=\"https:\/\/www.google.com\/url?q=https:\/\/theglobepost.com\/2019\/03\/19\/consequences-of-inequality\/&amp;source=gmail&amp;ust=1554761874482000&amp;usg=AFQjCNE0N1NFq1YA49JMz8CbWK9bN8HGKQ\">wealth inequalities<\/a>&nbsp;that these companies create, whether in their local communities or globally.<\/p>\n\n\n\n<p>Further, oversight must strengthen the&nbsp;<a href=\"https:\/\/www.washingtonpost.com\/business\/2018\/12\/13\/how-tech-workers-are-fueling-new-employee-activism-movement\/?noredirect=on&amp;utm_term=.ec432e42ef06\" target=\"_blank\" rel=\"noopener noreferrer\" data-saferedirecturl=\"https:\/\/www.google.com\/url?q=https:\/\/www.washingtonpost.com\/business\/2018\/12\/13\/how-tech-workers-are-fueling-new-employee-activism-movement\/?noredirect%3Don%26utm_term%3D.ec432e42ef06&amp;source=gmail&amp;ust=1554761874482000&amp;usg=AFQjCNEatKfvgRWXEEnxAsoYq_99fJyD9w\">employment rights of tech workers<\/a>. 
These employees have been at the forefront of campaigns that monitor the practices of tech companies, and have been instrumental in highlighting projects whose sole purpose seems to be enhancing state surveillance and targeting vulnerable people.<\/p>\n\n\n\n<p>For example,&nbsp;<a href=\"https:\/\/theglobepost.com\/2018\/08\/18\/google-china-impact\/\" target=\"_blank\" rel=\"noopener noreferrer\" data-saferedirecturl=\"https:\/\/www.google.com\/url?q=https:\/\/theglobepost.com\/2018\/08\/18\/google-china-impact\/&amp;source=gmail&amp;ust=1554761874482000&amp;usg=AFQjCNFUNs745ZSlTvM6qdF0Rmor5aWRag\">Google<\/a>&nbsp;employees called out their company\u2019s \u2013 now abandoned \u2013 <a href=\"https:\/\/thinkprogress.org\/google-dragonfly-china-censorship-human-rights-fe1fe1ddf55a\/\" target=\"_blank\" rel=\"noopener noreferrer\" data-saferedirecturl=\"https:\/\/www.google.com\/url?q=https:\/\/thinkprogress.org\/google-dragonfly-china-censorship-human-rights-fe1fe1ddf55a\/&amp;source=gmail&amp;ust=1554761874482000&amp;usg=AFQjCNHJyb-4AUz-rd4ay258TGxOXhP7uw\">Dragonfly project<\/a>&nbsp;and its \u2013 also discontinued \u2013 participation in&nbsp;<a href=\"https:\/\/thinkprogress.org\/google-dragonfly-china-censorship-human-rights-fe1fe1ddf55a\/\" target=\"_blank\" rel=\"noopener noreferrer\" data-saferedirecturl=\"https:\/\/www.google.com\/url?q=https:\/\/thinkprogress.org\/google-dragonfly-china-censorship-human-rights-fe1fe1ddf55a\/&amp;source=gmail&amp;ust=1554761874482000&amp;usg=AFQjCNHJyb-4AUz-rd4ay258TGxOXhP7uw\">Project Maven<\/a>, while Amazon staff asked their company to stop selling&nbsp;<a href=\"https:\/\/theglobepost.com\/2019\/03\/31\/facial-recognition-bias\/\" target=\"_blank\" rel=\"noopener noreferrer\" data-saferedirecturl=\"https:\/\/www.google.com\/url?q=https:\/\/theglobepost.com\/2019\/03\/31\/facial-recognition-bias\/&amp;source=gmail&amp;ust=1554761874482000&amp;usg=AFQjCNG7iuBFCGhmwlfm-Xmo0EenN4LUFw\">face recognition 
software<\/a>&nbsp;to law enforcement agencies.<\/p>\n\n\n\n<p>These examples highlight two additional concerns: the&nbsp;<a href=\"https:\/\/theglobepost.com\/2018\/08\/02\/google-china-search-engine\/\" target=\"_blank\" rel=\"noopener noreferrer\" data-saferedirecturl=\"https:\/\/www.google.com\/url?q=https:\/\/theglobepost.com\/2018\/08\/02\/google-china-search-engine\/&amp;source=gmail&amp;ust=1554761874482000&amp;usg=AFQjCNEZTdUmbBDYXmHx-yVzBZQkEyf8AQ\">collaboration between private corporations and state agencies<\/a>&nbsp;tasked with security, intelligence, and criminal justice responsibilities; and the design and&nbsp;<a href=\"https:\/\/theglobepost.com\/2018\/06\/27\/china-us-artificial-intelligence\/\" target=\"_blank\" rel=\"noopener noreferrer\" data-saferedirecturl=\"https:\/\/www.google.com\/url?q=https:\/\/theglobepost.com\/2018\/06\/27\/china-us-artificial-intelligence\/&amp;source=gmail&amp;ust=1554761874482000&amp;usg=AFQjCNEnZcG4hz0hu0D2Bs3BmyyT5zCxyA\">development of AI-based technologies<\/a>&nbsp;with a negative impact on human rights, civil liberties, and the capacity to participate in democratic politics.<\/p>\n\n\n\n<p>&nbsp;<\/p>\n\n\n\n<p><strong>AI as Technology of Control<\/strong><\/p>\n\n\n\n<p>Our rights-based democracies are vulnerable to the enormous&nbsp;<a href=\"https:\/\/www.theverge.com\/2018\/1\/23\/16907238\/artificial-intelligence-surveillance-cameras-security\" target=\"_blank\" rel=\"noopener noreferrer\" data-saferedirecturl=\"https:\/\/www.google.com\/url?q=https:\/\/www.theverge.com\/2018\/1\/23\/16907238\/artificial-intelligence-surveillance-cameras-security&amp;source=gmail&amp;ust=1554761874482000&amp;usg=AFQjCNHlh31wQaV7QW1JJDAfgfnleO_bcA\">surveillance capacity of AI-driven systems<\/a>, which undermine our right to&nbsp;<a href=\"https:\/\/theglobepost.com\/2018\/12\/20\/facebook-data-protection\/\" target=\"_blank\" rel=\"noopener noreferrer\" 
data-saferedirecturl=\"https:\/\/www.google.com\/url?q=https:\/\/theglobepost.com\/2018\/12\/20\/facebook-data-protection\/&amp;source=gmail&amp;ust=1554761874482000&amp;usg=AFQjCNGd3AnIOBC2CB9tFOD5p5xXUd388A\">privacy<\/a>&nbsp;and which interfere with the rights to freedom of expression and movement.<\/p>\n\n\n\n<p>I do not deny that AI can be a force for good. But when used badly, wrongly, or with malicious intent, it becomes a technology of control. For example, the roll-out of face recognition technology in public spaces, or practices such as&nbsp;<a href=\"https:\/\/www.smithsonianmag.com\/innovation\/artificial-intelligence-is-now-used-predict-crime-is-it-biased-180968337\/\" target=\"_blank\" rel=\"noopener noreferrer\" data-saferedirecturl=\"https:\/\/www.google.com\/url?q=https:\/\/www.smithsonianmag.com\/innovation\/artificial-intelligence-is-now-used-predict-crime-is-it-biased-180968337\/&amp;source=gmail&amp;ust=1554761874482000&amp;usg=AFQjCNEoGWb2ms-gtcap-hx7RTURe7i9yg\">predictive policing<\/a>, creates a chilling effect that undermines the political culture in which democratic politics thrives.<\/p>\n\n\n\n<p><a href=\"https:\/\/theglobepost.com\/2018\/04\/10\/social-media-engines\/\" target=\"_blank\" rel=\"noopener noreferrer\" data-saferedirecturl=\"https:\/\/www.google.com\/url?q=https:\/\/theglobepost.com\/2018\/04\/10\/social-media-engines\/&amp;source=gmail&amp;ust=1554761874482000&amp;usg=AFQjCNGqluJrlc4FA4UQtUsGVPVMODHzxg\">Data harvesting by private corporations<\/a>&nbsp;and data sharing between those corporations and law enforcement agencies compound the threat to individual human rights, civil liberties, and the framework of democratic politics.<\/p>\n\n\n\n<p>&nbsp;<\/p>\n\n\n\n<p><strong>Artificial Intelligence Paradox<\/strong><\/p>\n\n\n\n<p><a href=\"https:\/\/theglobepost.com\/2018\/12\/18\/artificial-intelligence-human-labor\/\" target=\"_blank\" rel=\"noopener noreferrer\" 
data-saferedirecturl=\"https:\/\/www.google.com\/url?q=https:\/\/theglobepost.com\/2018\/12\/18\/artificial-intelligence-human-labor\/&amp;source=gmail&amp;ust=1554761874482000&amp;usg=AFQjCNHLqXWVn5c2_Hl4rwrAKtbuJY7cSg\">Artificial intelligence<\/a>&nbsp;presents us with a paradox: AI is here to stay, and we increasingly rely on AI systems in our everyday lives and in practices as democratically engaged citizens, even in our criticism of AI.<\/p>\n\n\n\n<p>We must urgently decide what&nbsp;<a href=\"https:\/\/www.newstatesman.com\/science-tech\/technology\/2018\/08\/how-ai-could-kill-democracy-0\" target=\"_blank\" rel=\"noopener noreferrer\" data-saferedirecturl=\"https:\/\/www.google.com\/url?q=https:\/\/www.newstatesman.com\/science-tech\/technology\/2018\/08\/how-ai-could-kill-democracy-0&amp;source=gmail&amp;ust=1554761874482000&amp;usg=AFQjCNFO3_krfao7ZtbgZjG74yvGPkLADw\">AI-mediated democracies<\/a>&nbsp;should look like. Can the AI genie released from the bottle of human creativity be democratized?<\/p>\n\n\n\n<p>Democratic practices and values, including&nbsp;<a href=\"https:\/\/theglobepost.com\/2019\/01\/30\/globalization-inequality\/\" target=\"_blank\" rel=\"noopener noreferrer\" data-saferedirecturl=\"https:\/\/www.google.com\/url?q=https:\/\/theglobepost.com\/2019\/01\/30\/globalization-inequality\/&amp;source=gmail&amp;ust=1554761874482000&amp;usg=AFQjCNF0xeTT_AYbPQzMK6Ym7skutXQEjQ\">equality<\/a>, participation, and accountability, must underpin the governance of AI, from the design and development stages to its application by users. 
AI needs democratic governance.<\/p>\n\n\n\n<p>&nbsp;<\/p>\n\n\n\n<p>This article originally appeared on <a href=\"https:\/\/theglobepost.com\/2019\/04\/09\/artificial-intelligence-democratic-governance\/\" target=\"_blank\" rel=\"noopener\">The Globe Post.&nbsp;<\/a><\/p>\n\n\n\n<p><em>The&nbsp;<a href=\"https:\/\/www.flickr.com\/photos\/6eotech\/46829072852\/in\/photolist-2em8eRN-78gAtb-H3yrFH-284JoNU-29q8n2i-dLSKTQ-23zLnKT-2aJ6poY-M3wygP-SjCgwQ-57T5n8-29REGc9-ShBxCm-M3wziP-7Mbzwn-NETXus-283EziL-283EzSG-2aJ6qs1-NETZWS-2aNvjqZ-283ExjA-NETZhA-6zeQLo-WKgYZG-4KTiY4-emaoyD-24DS86P-283EBE9-M3wyA6-283EA8G-TBtZ31-feRjWf-283EBvb-2aJ6ta5-TnPTMx-9hMiUz-brdXxC-283EAmh-MCzawX-NsNhED-W7ehfL-N99HXf-4Wrk4s-N99KLW-4GXcBN-5JpQEE-W7ehHu-3aPuVg-cv8pw1\" target=\"_blank\" rel=\"noopener\">featured&nbsp;image<\/a><\/em>&nbsp;<em>has been used courtesy of a&nbsp;<\/em><a href=\"https:\/\/creativecommons.org\/licenses\/by-nc\/2.0\/\" target=\"_blank\" rel=\"noopener\"><em>Creative Commons license.&nbsp;<\/em><\/a><\/p>\n\n\n\n<p>&nbsp;<\/p>\n","protected":false},"excerpt":{"rendered":"<p>AI poses serious threats to democratic politics and institutions, as well as our capacity and right to engage freely in democratic practices, says Dr Birgit Schippers. 
<\/p>\n","protected":false},"author":2533,"featured_media":5505,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"_jetpack_memberships_contains_paid_content":false,"footnotes":""},"categories":[444],"tags":[742,743],"class_list":["post-5501","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-information-technology","tag-ai","tag-artificial-intelligence"],"mb":[],"acf":{"authors":{"simple_value_formatted":"","value_formatted":null,"value":null,"field":{"ID":9774,"key":"field_66d0cbf58f930","label":"Authors","name":"authors","aria-label":"","prefix":"acf","type":"relationship","value":null,"menu_order":1,"instructions":"","required":0,"id":"","class":"","conditional_logic":0,"parent":9772,"wrapper":{"width":"","class":"","id":""},"post_type":["authors"],"post_status":["publish"],"taxonomy":"","filters":["search"],"return_format":"id","min":0,"max":10,"allow_in_bindings":0,"elements":["featured_image"],"bidirectional":0,"bidirectional_target":[],"_name":"authors","_valid":1}},"description":{"simple_value_formatted":"","value_formatted":"","value":"","field":{"ID":9776,"key":"field_66d2183027749","label":"Description","name":"description","aria-label":"","prefix":"acf","type":"wysiwyg","value":null,"menu_order":3,"instructions":"","required":0,"id":"","class":"","conditional_logic":0,"parent":9772,"wrapper":{"width":"","class":"","id":""},"default_value":"","allow_in_bindings":0,"tabs":"all","toolbar":"basic","media_upload":0,"delay":1,"_name":"description","_valid":1}}},"jetpack_featured_media_url":"https:\/\/blogs.qub.ac.uk\/qpol\/wp-content\/uploads\/sites\/76\/2019\/04\/Artificial-intelligence.jpg","jetpack_sharing_enabled":true,"amp_enabled":true,"mfb_rest_fields":["title","jetpack_featured_media_url","jetpack_sharing_enabled","amp_enabled"],"_links":{"self":[{"href":"https:\/\/blogs.qub.ac.uk\/qpol\/wp-json\/wp\/v2\/posts\/5501","targetHints":{"allo
w":["GET"]}}],"collection":[{"href":"https:\/\/blogs.qub.ac.uk\/qpol\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/blogs.qub.ac.uk\/qpol\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/blogs.qub.ac.uk\/qpol\/wp-json\/wp\/v2\/users\/2533"}],"replies":[{"embeddable":true,"href":"https:\/\/blogs.qub.ac.uk\/qpol\/wp-json\/wp\/v2\/comments?post=5501"}],"version-history":[{"count":0,"href":"https:\/\/blogs.qub.ac.uk\/qpol\/wp-json\/wp\/v2\/posts\/5501\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/blogs.qub.ac.uk\/qpol\/wp-json\/wp\/v2\/media\/5505"}],"wp:attachment":[{"href":"https:\/\/blogs.qub.ac.uk\/qpol\/wp-json\/wp\/v2\/media?parent=5501"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/blogs.qub.ac.uk\/qpol\/wp-json\/wp\/v2\/categories?post=5501"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/blogs.qub.ac.uk\/qpol\/wp-json\/wp\/v2\/tags?post=5501"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}