{"id":4190,"date":"2016-08-11T19:13:46","date_gmt":"2016-08-11T19:13:46","guid":{"rendered":"http:\/\/www.odbms.org\/blog\/?p=4190"},"modified":"2016-08-19T17:53:29","modified_gmt":"2016-08-19T17:53:29","slug":"machines-of-loving-grace-interview-with-john-markoff","status":"publish","type":"post","link":"https:\/\/www.odbms.org\/blog\/2016\/08\/machines-of-loving-grace-interview-with-john-markoff\/","title":{"rendered":"Machines of Loving Grace. Interview with John Markoff."},"content":{"rendered":"<blockquote><p><strong>&#8220;Intelligent system designers do have ethical responsibilities.&#8221;<br \/>\n&#8211;John Markoff.<\/strong><\/p><\/blockquote>\n<p>I have interviewed\u00a0<strong>John Markoff<\/strong>, <em>technology writer at\u00a0The New York Times.\u00a0<br \/>\n<\/em>In\u00a02013 he was awarded a Pulitzer Prize.<br \/>\nThe interview is related to his recent book\u00a0<em>\u201cMachines of Loving Grace: The Quest for Common Ground Between Humans and Robots,\u201d\u00a0<\/em>published in August of 2015 by HarperCollins Ecco.<\/p>\n<p>RVZ<\/p>\n<p><strong>Q1. Do you share the concerns of prominent technology leaders such\u00a0as Tesla\u2019s chief executive, Elon Musk, who suggested we might need to\u00a0regulate the development of artificial intelligence?<\/strong><\/p>\n<p><strong>John Markoff:\u00a0<\/strong>I share their concerns, but not their assertions that we may be on the cusp of some kind of singularity or rapid advance to artificial general intelligence. I do think that machine autonomy raises specific ethical and safety concerns, and regulation is an obvious response.<\/p>\n<p><strong>Q2. How difficult is it to reconcile the different interests of the people\u00a0who are involved in a direct or indirect way in developing and deploying\u00a0new technology?<\/strong><\/p>\n<p><strong>John Markoff:\u00a0<\/strong>This is why we have governments and governmental regulation. 
I think <a onclick=\"javascript:pageTracker._trackPageview('\/outgoing\/en.wikipedia.org\/wiki\/Artificial_intelligence');\"  href=\"https:\/\/en.wikipedia.org\/wiki\/Artificial_intelligence\" target=\"_blank\">AI<\/a>, in that respect, is no different from any other technology. It should and can be regulated when human safety is at stake.<\/p>\n<p><strong>Q3. In your book Machines of Loving Grace,\u00a0you\u00a0argued that &#8220;we must decide to design ourselves into our future, or risk<\/strong>\u00a0<strong>being excluded from it altogether&#8221;. What do you mean by that?<\/strong><\/p>\n<p><strong>John Markoff: <\/strong>You can use AI technologies either to automate or to augment humans. The problem is minimized when you take an approach that is based on <a onclick=\"javascript:pageTracker._trackPageview('\/outgoing\/en.wikipedia.org\/wiki\/User-centered_design');\"  href=\"https:\/\/en.wikipedia.org\/wiki\/User-centered_design\" target=\"_blank\">human-centric design<\/a> principles.<\/p>\n<p><strong>Q4. How is this possible in practice? Isn&#8217;t the technology space dominated\u00a0by giants such as IBM, Apple, and Google, who dictate the direction of new\u00a0technology?<\/strong><\/p>\n<p><strong>John Markoff: \u00a0<\/strong>This is a very interesting time, with \u201cgiant\u201d technology companies realizing that there are consequences in the deployment of these technologies. Google, IBM and Microsoft have all recently made public commitments to the safe use of AI.<\/p>\n<p><strong>Q5. What are the most significant new developments in the human-computer\u00a0area that are likely to have a significant influence on our daily life in\u00a0the near future?<\/strong><\/p>\n<p><strong>John Markoff: \u00a0<\/strong>One of the best things about being a reporter is that you don\u2019t have to predict the future. You only have to note what the various visionaries say, so you can call that to their attention when their predictions prove inaccurate. 
With that caveat, if I am forced to bet on any particular information technology, it would be augmented reality. This is because I believe that multi-touch interfaces for mobile devices simply can\u2019t be the last step in user interface design.<\/p>\n<p><strong>Q6. Do you believe that robots will really transform modern life?<\/strong><\/p>\n<p><strong>John Markoff: \u00a0<\/strong>I struggle with the definition of what a \u201crobot\u201d is. If something is tele-operated, for example, is it a robot? That said, I think that we will increasingly be surrounded by machines that perform tasks.<br \/>\nThe question is whether they will come as quickly as Silicon Valley seems to believe. My friend <a onclick=\"javascript:pageTracker._trackPageview('\/outgoing\/en.wikipedia.org\/wiki\/Paul_Saffo');\"  href=\"https:\/\/en.wikipedia.org\/wiki\/Paul_Saffo\" target=\"_blank\">Paul Saffo<\/a> has said, <em>\u201cNever mistake a clear view for a short distance.\u201d<\/em> And I think that is the case with all kinds of <a onclick=\"javascript:pageTracker._trackPageview('\/outgoing\/en.wikipedia.org\/wiki\/Mobile_robot');\"  href=\"https:\/\/en.wikipedia.org\/wiki\/Mobile_robot\" target=\"_blank\">mobile robots<\/a>, including <a onclick=\"javascript:pageTracker._trackPageview('\/outgoing\/en.wikipedia.org\/wiki\/Autonomous_car');\"  href=\"https:\/\/en.wikipedia.org\/wiki\/Autonomous_car\" target=\"_blank\">self-driving cars<\/a>.<\/p>\n<p><strong>Q7. For the designers of intelligent systems, how difficult is it to draw a\u00a0line between what is human and what is machine?<\/strong><\/p>\n<p><strong>John Markoff: \u00a0<\/strong>I feel strongly that the possibility of designing cyborgs, particularly with respect to intellectual prostheses, is a boundary we should cross with great caution. 
Remember the <a onclick=\"javascript:pageTracker._trackPageview('\/outgoing\/en.wikipedia.org\/wiki\/Borg_(Star_Trek)');\"  href=\"https:\/\/en.wikipedia.org\/wiki\/Borg_(Star_Trek)\" target=\"_blank\">Borg from Star Trek<\/a>. <em>\u201cResistance is futile; you will be assimilated.\u201d<\/em> I think the challenge is to use these systems to enhance human thought, not for social control.<\/p>\n<p><strong>Q8. What are the ethical responsibilities of designers of intelligent\u00a0systems?<\/strong><\/p>\n<p><strong>John Markoff: <\/strong>I think the most important aspect of that question is the simple acknowledgement that intelligent system designers do have ethical responsibilities. That has not always been the case, but it seems to be a growing force within the community of AI and robotics designers in the past five years, so I\u2019m not entirely pessimistic.<\/p>\n<p><strong>Q9. If humans delegate decisions to machines, who will be responsible for\u00a0the consequences?<\/strong><\/p>\n<p><strong>John Markoff:\u00a0<\/strong><a onclick=\"javascript:pageTracker._trackPageview('\/outgoing\/en.wikipedia.org\/wiki\/Ben_Shneiderman');\"  href=\"https:\/\/en.wikipedia.org\/wiki\/Ben_Shneiderman\" target=\"_blank\">Ben Shneiderman<\/a>, the University of Maryland computer scientist and user interface designer, has written eloquently on this point. Indeed, he argues against autonomous systems for precisely this reason. His point is that it is essential to keep a human in the loop.\u00a0If not, you run the risk of abdicating ethical responsibility for system design.<\/p>\n<p><strong>Q10. Assuming there is real potential in using data-driven methods to\u00a0both help charities develop better services and products, and understand\u00a0civil society activity. 
In your opinion, what are the key lessons and\u00a0recommendations for future work in this space?<\/strong><\/p>\n<p><strong>John Markoff: <\/strong>I\u2019m afraid I\u2019m not an expert in the IT needs of either charities or NGOs. That said, a wide range of AI advances are already being delivered at nominal cost via smartphones. As cheap sensors proliferate, virtually all everyday objects will gain intelligence that will be widely accessible.<\/p>\n<p><strong>Qx. Anything else you wish to add?<\/strong><\/p>\n<p><strong>John Markoff: <\/strong>Only that I think it is interesting that the augmentation vs. automation dichotomy is increasingly seen as a path through which to navigate the impact of these technologies. Computer system designers are the ones who will decide what the impact of these technologies will be and whether to replace or augment humans in society.<\/p>\n<p>&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;-<\/p>\n<p><strong>JOHN GREGORY MARKOFF<\/strong><\/p>\n<p><em>John Markoff joined The New York Times in March 1988 as a reporter for the business section. He is now a technology writer based in the San Francisco bureau of the paper. Prior to joining the Times, he worked for The San Francisco Examiner from 1985 to 1988. He reported for the New York Times Science Section from 2010 to 2015.<\/em><\/p>\n<p><em>Markoff has written about technology and science since 1977. 
He covered technology and the defense industry for The Pacific News Service in San Francisco from 1977 to 1981; he was a reporter at Infoworld from 1981 to 1983; he was the West Coast editor for Byte Magazine from 1984 to 1985 and wrote a column on personal computers for The San Jose Mercury from 1983 to 1985.<\/em><\/p>\n<p><em>He has also been a lecturer at the University of California at Berkeley School of Journalism and an adjunct faculty member of the Stanford Graduate Program on Journalism.<\/em><\/p>\n<p><em>The Times nominated him for a Pulitzer Prize in 1995, 1998 and 2000. The San Francisco Examiner nominated him for a Pulitzer in 1987. In 2005, with a group of Times reporters, he received the Loeb Award for business journalism. In 2007 he shared the Society of American Business Editors and Writers Breaking News award. In 2013 he was awarded a Pulitzer Prize in explanatory reporting as part of a New York Times project on labor and automation.<\/em><\/p>\n<p><em>In 2007 he became a member of the International Media Council at the World Economic Forum. Also in 2007, he was named a fellow of the Society of Professional Journalists, the organization\u2019s highest honor.<\/em><\/p>\n<p><em>In June of 2010 the New York Times presented him with the Nathaniel Nash Award, which is given annually for foreign and business reporting.<\/em><\/p>\n<p><em>Born in Oakland, California on October 29, 1949, Markoff grew up in Palo Alto, California and graduated from Whitman College, Walla Walla, Washington, in 1971. 
He attended graduate school at the University of Oregon and received a master\u2019s degree in sociology in 1976.<\/em><\/p>\n<p><em>Markoff is the co-author of \u201cThe High Cost of High Tech,\u201d published in 1985 by Harper &amp; Row.\u00a0He wrote \u201cCyberpunk: Outlaws and Hackers on the Computer Frontier\u201d with Katie Hafner, which was published in 1991 by Simon &amp; Schuster.<br \/>\nIn January of 1996 Hyperion published &#8220;Takedown: The Pursuit and Capture of America&#8217;s Most Wanted Computer Outlaw,&#8221; which he co-authored with Tsutomu Shimomura. \u201cWhat the Dormouse Said: How the Sixties Counterculture Shaped the Personal Computer Industry\u201d was published in 2005 by Viking Books. \u201cMachines of Loving Grace: The Quest for Common Ground Between Humans and Robots\u201d was published in August of 2015 by HarperCollins Ecco.<\/em><\/p>\n<p><em>He is currently researching a biography of Stewart Brand.<\/em><\/p>\n<p><em>He is married to Leslie Terzian Markoff and they live in San Francisco, Calif.<\/em><\/p>\n<p><strong>Resources<\/strong><\/p>\n<p><a onclick=\"javascript:pageTracker._trackPageview('\/outgoing\/www.harpercollins.com\/9780062266705\/machines-of-loving-grace');\"  href=\"http:\/\/www.harpercollins.com\/9780062266705\/machines-of-loving-grace\" target=\"_blank\">MACHINES OF LOVING GRACE &#8211; The\u00a0Quest for Common Ground Between Humans and Robots<\/a> By John Markoff,\u00a0Illustrated. 378 pp. Ecco\/HarperCollins Publishers.<\/p>\n<p>&#8211;<a onclick=\"javascript:pageTracker._trackPageview('\/outgoing\/faculty.washington.edu\/jtenenbg\/courses\/360\/f04\/sessions\/schneidermanGoldenRules.html');\"  href=\"https:\/\/faculty.washington.edu\/jtenenbg\/courses\/360\/f04\/sessions\/schneidermanGoldenRules.html\" target=\"_blank\">Shneiderman&#8217;s &#8220;Eight Golden Rules of Interface Design&#8221;<\/a>. These rules were obtained from the text Designing the User Interface by Ben Shneiderman. 
<\/p>\n<p>&#8211; <a onclick=\"javascript:pageTracker._trackPageview('\/outgoing\/cs.umd.edu\/hcil\/DTUI6\/');\"  href=\"http:\/\/cs.umd.edu\/hcil\/DTUI6\/\" target=\"_blank\">&#8220;Designing the User Interface&#8221;, 6th Edition.<\/a> This is a revised edition of the highly successful textbook on Human-Computer Interaction originally developed by Ben Shneiderman and Catherine Plaisant at the University of Maryland.<\/p>\n<p><strong>Related Posts<\/strong><\/p>\n<p>&#8211;\u00a0<a onclick=\"javascript:pageTracker._trackPageview('\/outgoing\/www.odbms.org\/blog\/2016\/04\/recruit-institute-of-technology-interview-with-alon-halevy\/');\"  href=\"http:\/\/www.odbms.org\/blog\/2016\/04\/recruit-institute-of-technology-interview-with-alon-halevy\/\" target=\"_blank\" rel=\"nofollow\">Recruit Institute of Technology. Interview with Alon Halevy<\/a>.\u00a0<span class=\"feed-source\">\u00a0ODBMS Industry Watch,\u00a0<\/span><span class=\"feed-date\">Published on 2016-04-02<\/span><\/p>\n<p>\u2013\u00a0<a onclick=\"javascript:pageTracker._trackPageview('\/outgoing\/www.odbms.org\/2016\/02\/civility-in-the-age-of-artificial-intelligence\/');\"  href=\"http:\/\/www.odbms.org\/2016\/02\/civility-in-the-age-of-artificial-intelligence\/\" target=\"_blank\">Civility in the Age of Artificial Intelligence<\/a>,\u00a0\u00a0by\u00a0STEVE LOHR,\u00a0<em>technology reporter for The New York Times, <\/em>ODBMS.org<\/p>\n<p>\u2013\u00a0<a onclick=\"javascript:pageTracker._trackPageview('\/outgoing\/www.odbms.org\/blog\/2016\/01\/on-artificial-intelligence-and-society-interview-with-oren-etzioni\/');\"  href=\"http:\/\/www.odbms.org\/blog\/2016\/01\/on-artificial-intelligence-and-society-interview-with-oren-etzioni\/\" target=\"_blank\">On Artificial Intelligence and Society. 
Interview with Oren Etzioni,<\/a>\u00a0ODBMS Industry Watch.<\/p>\n<p>\u2013\u00a0<a onclick=\"javascript:pageTracker._trackPageview('\/outgoing\/www.odbms.org\/blog\/2016\/01\/on-big-data-and-society-interview-with-viktor-mayer-schonberger\/');\"  href=\"http:\/\/www.odbms.org\/blog\/2016\/01\/on-big-data-and-society-interview-with-viktor-mayer-schonberger\/\" target=\"_blank\" rel=\"nofollow\">On Big Data and Society. Interview with Viktor Mayer-Sch\u00f6nberger<\/a>,\u00a0<span class=\"feed-source\">ODBMS Industry Watch.<\/span><\/p>\n<p><strong>Follow us on Twitter:\u00a0<a onclick=\"javascript:pageTracker._trackPageview('\/outgoing\/twitter.com\/odbmsorg');\"  href=\"https:\/\/twitter.com\/odbmsorg\" target=\"_blank\">@odbmsorg<\/a><\/strong><\/p>\n<p># #<\/p>\n","protected":false},"excerpt":{"rendered":"<p>&#8220;Intelligent system designers do have ethical responsibilities.&#8221; &#8211;John Markoff. I have interviewed\u00a0John Markoff, technology writer at\u00a0The New York Times.\u00a0 In\u00a02013 he was awarded a Pulitzer Prize. The interview is related to his recent book\u00a0\u201cMachines of Loving Grace: The Quest for Common Ground Between Humans and Robots,\u201d\u00a0published in August of 2015 by HarperCollins Ecco. RVZ Q1. 
[&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":[],"categories":[1],"tags":[990,993,989,224,987,263,977,984,986,383,991,985,913,992,988,978,994],"_links":{"self":[{"href":"https:\/\/www.odbms.org\/blog\/wp-json\/wp\/v2\/posts\/4190"}],"collection":[{"href":"https:\/\/www.odbms.org\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.odbms.org\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.odbms.org\/blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.odbms.org\/blog\/wp-json\/wp\/v2\/comments?post=4190"}],"version-history":[{"count":17,"href":"https:\/\/www.odbms.org\/blog\/wp-json\/wp\/v2\/posts\/4190\/revisions"}],"predecessor-version":[{"id":4225,"href":"https:\/\/www.odbms.org\/blog\/wp-json\/wp\/v2\/posts\/4190\/revisions\/4225"}],"wp:attachment":[{"href":"https:\/\/www.odbms.org\/blog\/wp-json\/wp\/v2\/media?parent=4190"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.odbms.org\/blog\/wp-json\/wp\/v2\/categories?post=4190"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.odbms.org\/blog\/wp-json\/wp\/v2\/tags?post=4190"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}