Where minds meet machines:
World Legal Summit 2019 recap
Summary writer: Anne Pickering
Hosted simultaneously in 23 countries across the world, the 2019 World Legal Summit, held on 1 August 2019, brought together world-leading experts, practitioners and scholars representing legal practice, academia and industry.
These specialists shared their knowledge on emerging technologies, legislative insights and regulatory structures and best practice for dealing with challenges in a global context.
This year, UQ Law in collaboration with the Centre for Legal Innovation (Australia, New Zealand and Asia Pacific) at The College of Law, held a dedicated event in Brisbane that was also live-streamed across the world, with delegates from Europe, North America, South America and the Asia Pacific tuning in throughout the day.
Speakers covered a wide range of issues, from identity and personal governance to cybersecurity and autonomous machines, and panellists facilitated lively discussion, engaging with audience members on these topics.
Read on for the key takeaways of each session.
Identity and personal governance — Redefining systems and processes in the world of “big brother”
Presented by Bill Singleton, Partner, HWL Ebsworth
Panellists: Heinz Lepahe, Partner, HWL Ebsworth | Nicole Stephensen, Principal Consultant, Ground Up Consulting Pty Ltd and Executive Director, Privacy & Data Protection, IoT Security Institute.
- This session tackled the complicated intersection between technology and privacy.
- It examined how ‘identity’ is constituted in both a physical and digital sense.
- Panel discussion concerned practical issues relating to employment law and identity governance.
- Key takeaway: regulatory and technological solutions can be used to address the issues that arise in the nexus between law and technology. Both approaches have an important role to play in finding solutions for issues concerning identity and artificial intelligence.
This session provided an introduction to the vexing subject of identity, how aspects of traditional identity are changing and the impact of artificial intelligence (AI) on identity. The primary focus of the session was the need to respond to the regulatory challenges due to changing identity, and the importance for legislation and policy to keep up with developments around the world — both within the digital world and within our own jurisdictions.
The presenter, Bill Singleton from HWL Ebsworth, noted that the final report of the ACCC Digital Platforms Inquiry (July 2019) highlighted that many innovative digital technologies, such as the collection of location data and facial recognition, push the boundaries of the regulatory environment.
Furthermore, he suggested the collection and use of data now goes beyond the traditional definition of personal information and personal data.
So, how does ‘identity’ fit in? Well, according to Singleton, questions around ‘what identity is’ are hard to answer because many things that identify us physically — for example, hair, eye colour, voice — change over time. The only aspect of our physical identity that does not typically change is our fingerprints.
Singleton also looked at how our digital identity has taken over our traditional identity, even though it’s difficult to know who, or what, is behind a digital identity – a person, a robot, an entity? Thus, the concept of identity in the digital world is still a work in progress.
Singleton outlined emerging themes from the conundrum of this changing identity which included privacy concerns, the failure of existing regulatory frameworks and the need for review, the collection of data and informed consent, the moral dilemma surrounding facial recognition technology, and questions of liability surrounding AI.
Singleton’s presentation was followed by a panel discussion which focused on practical issues relating to employment law and identity governance.
Heinz Lepahe used a recent case, Lee v Superior Wood Pty Ltd  FWC 4762, to highlight the issues that the collection of biometric data (data relating to physical features and make-up, such as fingerprints) raises under Australia’s privacy legislation. He described the practical implications for employers: specifically, that the Privacy Act 1988 obliges employers to ensure a sufficient process of notification and informed consent is followed.
One of the key points raised by panellist Nicole Stephensen was that employers must use good implementation approaches, informed by current statutory frameworks, to make critical decisions. This is complicated by the fact that legislation may not be sufficient to address emerging technology issues. Employers can also draw on the international best-practice guides and standards that are freely available.
While we grapple with concerns about privacy and new forms of data collection, and while legislation around the world is nowhere near addressing some of the important issues relating to AI, the good news is that progressive steps are being taken to tackle them.
Autonomous Machines – Responsibility and Liability versus Accessibility and Convenience
Presented by Dr Alan Davidson, Senior Lecturer, University of Queensland School of Law
Panellists: Zoe Eather, Consultant – Smart Regions & Mobility, Arup and Managing Director, My Smart Community | Dr Allison Fish, Senior Lecturer, University of Queensland School of Law | Professor Andry Rakotonirainy, Director, Centre for Accident Research & Road Safety – Queensland, QUT | Professor John Swinson, University of Queensland School of Law and Partner, King & Wood Mallesons.
- This session raised questions surrounding liability when it comes to autonomous machines such as self-driving vehicles.
- It examined questions of ethics and machines – particularly as they related to AI and autonomous machines.
- Key takeaway: automated vehicles are closer to becoming part of our everyday lives than many of us realise. Policy and regulation will be integral to the successful integration of autonomous vehicles.
In this session, Dr Alan Davidson and a panel of experts urged legal practitioners to start considering the array of issues — legal or otherwise — that come with integrated autonomous machines, with liability and ethics being prominent ones.
Dr Davidson then examined the role ethics play in programming machines. Drawing on MIT’s Moral Machine experiment and the German ethical rules for automated driving, Davidson presented a series of thought-provoking scenarios to the audience to open up the conversation around the ethical and moral dilemmas surrounding autonomous vehicles. He asked them to consider, for a moment, who they would choose to save in the event of an accident: two large women and one man crossing the road together, or one female athlete, one woman and one man who were all breaking the law by crossing at a red signal.
The panel touched on a number of topics relating to autonomous technology, including a pilot test of automated vehicles underway in Queensland, managed and coordinated under the Cooperative and Automated Vehicle Initiative (CAVI). The program, supported and delivered by the Department of Transport and Main Roads, aims to help deliver the next generation of transport.
One of the panellists framed the approach to these issues from a legal practitioner’s perspective and suggested it was important to understand whether adaptations to existing rules or entirely new legislation were required in addressing each issue.
In framing liability, the panel suggested it was important to determine how responsibility would be apportioned between human and machine. For example, if the anti-fog equipment on a sensor fails and an automated car crashes, can you identify who or what is responsible? Is it the vehicle manufacturer, the company that created the map, the road authority that removed a road block, or the driver who might have selected the wrong mode? And how do you apportion liability among them?
Another panellist suggested we could view ‘autonomous machines’ as a situation of less human interaction rather than no interaction.
The panel suggested one of the major issues with automated vehicles relates to social justice — that is, the impact of autonomous vehicles on jobs and the transition from human-operated to automated models. There is also the issue of some countries not being able to absorb autonomous vehicles due to factors such as insufficient supporting infrastructure, the way people drive, and road safety and rules.
Cyber Security and Personal Data – Can/should we “protect” our privacy from the next breach?
Presented by Associate Professor Mark Burdon, Faculty of Law, QUT
Panellists: Nicole Murdoch, Principal, EAGLEGATE Lawyers; Director, Australian Information Security Association; Member, QLS Cybersecurity Working Group | Daniel Pearson, Adviser - General Insurance, Findex Insurance Brokers Pty Ltd | Kim Trajer, Chief Operating Officer, McCullough Robertson Lawyers and Member, Queensland Law Society Innovation Committee | David Williams, Managing Director, FinTechnology
- This session established that Australia’s cybersecurity framework is fragmented and that there is no specific cybersecurity law that covers issues of this nature.
- As such it looked at some of the specific laws through which cyber-related issues can be addressed.
- Key takeaway: organisations can better manage cyber-related risks through human capital strategies that look at better training, awareness and good online security practices.
This session provided an interesting overview of Australia’s cybersecurity law framework. This included discussion of how the Privacy Act fits within the wider framework, how ‘reasonable security’ is defined under Principle 11 of the Australian Privacy Principles, and what constitutes a notifiable data breach under the Privacy Act.
The presenter, Associate Professor Burdon, raised three key points:
- The legal structures relating to cybersecurity and privacy in Australia are largely fragmented.
- The principles-based framework in the Privacy Act delegates significant responsibility to regulated entities — reasonable security (APP 11).
- The need to enhance organisational security of personal information through the Notifiable Data Breaches (NDB) scheme, rather than relying on complaint mechanisms or civil penalties.
There is no specific cybersecurity law in Australia; instead, a range of other laws address cybercrime, protection of critical infrastructure, telecommunications operations and cyber-safety, alongside sector-specific guidelines and information privacy law.
Associate Professor Burdon noted that although these laws are conceptually similar, they are definitionally different.
In his examination of the duty of agencies and organisations covered by the Privacy Act 1988 to notify of data breaches, Associate Professor Burdon focused on the core components of Australian Privacy Principle (APP) 11. Specifically, he homed in on the six key considerations that determine whether an action constitutes a security breach: misuse, interference, loss, unauthorised access, unauthorised modification and unauthorised disclosure.
In their discussion the panel suggested some approaches to help organisations manage cybersecurity issues:
- Education: educate staff about what exactly personal information is.
- Controls and protection: prevention is better than cure. Take appropriate mitigation steps, continually monitor threats and act to prevent data breaches.
- Human understanding of protection: use strong passwords, multi-factor authentication, restrict administrative staff access, provide secure remote access and monitor.
- Awareness: educate top-down on the importance of the protection of personal information, data breaches and the use of correct technology.
- Risk awareness: the cost of breaches is significant, so it is important for businesses to know the risks.
Other issues that need to be considered include having an appropriate notification system to let clients know of breaches, finding easier options for SMEs, and the fact that organisations do not necessarily need to store information.
While education was a recurring suggestion, closing remarks also included: a cost-effectiveness analysis of the current state of the system, education across different industries using war stories, and the importance of risk management in preparing for risk incidents.
In closing, the panellists suggested that if big companies like Google and Facebook make billions using your data, it is time to put a value on your personal information.
Data – The New Electricity or The Next Regulatory Nightmare?
Presented by Dr Mark Staples, Senior Principal Researcher, Data61 CSIRO
Panellists: Jemima Harris, Director, LOD Innovation | Angus Murray, Partner and Trademark Attorney, Irish Bentley Lawyers; Co-founder and Director, The Legal Forecast; Sessional academic, University of Southern Queensland; Member, Queensland Law Society Innovation Committee; and Chair, Electronic Frontiers Australia Policy Committee | Chris Maher, Legal Counsel – Products, Marketing & IP, Legal & Corporate Affairs, Telstra.
- This session examined data and its value as an asset to the legal profession.
- It examined major challenges surrounding data regulation and how to gain insights without necessarily having to collect and store data.
- Key takeaway: data has value because it provides insights — there needs to be a way to source insights from data while exploring ways to use data in an ethical and responsible manner.
The final session of the day, presented by Dr Mark Staples from Data61 at CSIRO, covered a range of timely topics that tied together themes from the previous sessions, including data, analytics, value and regulation challenges.
Data is an asset to businesses, but it is different from a physical asset. Dr Staples emphasised that the significance of data for businesses is in the value it creates because it provides insights. Therefore, businesses do not necessarily need to collect data, but the challenge lies in how to get insights from data without collecting it.
The panellists discussed key challenges and trends around data and made the following suggestions:
- Data is just information. How we use data determines whether it is a toxic asset or not. When data is commodified, it is a human rights issue. Education, and ethical and responsible use of data are key areas of importance.
- The billable hour is not a good measure of productivity. One of its challenges is quality: how do you measure the quality of work?
- There is a lot of potential for technology to add value to society and access to justice.
- There are opportunities to use data sets to enhance processes.