The $600 Billion Question: Election Town Hall Tackles Australia's AI Future

As major parties stay silent on AI, this week’s town hall brings critical trust and safety questions into election focus.

CANBERRA, AUSTRALIA, April 7, 2025 /EINPresswire.com/ -- On Wednesday, 9 April, prominent AI experts and concerned citizens will gather at a Virtual Town Hall to discuss critical AI policy issues facing Australia's next government. Greens AI Spokesperson David Shoebridge has accepted his invitation. Labor and Liberal AI spokespeople, Ed Husic and Sussan Ley, have also been invited.

As artificial intelligence advances rapidly, decisions made in Australia's next Parliament will determine whether AI becomes a powerful force for good or causes catastrophic harm. This Virtual Town Hall will explore how Australia can ensure AI systems are developed safely while capturing a potential $600 billion economic benefit.

Over 200 experts, public figures, and concerned citizens have already endorsed an open letter calling for the next government to establish an AI Safety Institute (AISI) and to introduce an AI Act with mandatory guardrails for AI developers and deployers.

At the event, a panel of AI experts from industry and academia will discuss the risks and opportunities of AI, the economic implications of high and low public trust in AI, and why Australian businesses want an AI Act. The goal is a frank discussion that brings in a broad set of perspectives, not only those that align with the open letter.

"AI is not a distant concern," says Hunter Jay, the former CEO of Ripe Robotics. "It is entirely possible we see AIs that are more capable than humans during the next term of government. The trend towards more capable systems doesn't seem to be slowing. If anything, progress is accelerating as models begin to assist with building their own successors.”

Greg Sadler, coordinator of Australians for AI Safety, added, "Australians have been increasingly saying that the major parties aren't demonstrating a vision for the future. AI policy is one way for politicians to show they have a plan and are thinking about the long term."

Jisoo Kim, founder of CLEAR AI, emphasises the business perspective: "Australian businesses want clear AI regulation. When you buy a car in Australia, you trust it's passed rigorous safety checks. You're not expected to crash-test it yourself or be liable if the brakes fail. The same should apply to AI. That way, businesses can focus on innovation and growth, not just risk management."

While Australia committed to creating an AI Safety Institute when signing the Seoul Declaration and Bletchley Declaration, it remains the only signatory yet to fulfil this pledge.

During last year's Senate Select Committee Inquiry into AI Adoption, Independent Senator David Pocock recommended that an AI institute be created, but neither major party took a definitive position in the final report.

To help voters know where the parties stand, Australians for AI Safety has released an interactive AI safety scorecard showing varying levels of support across parties:

- The Australian Greens strongly support an AI Act with regulatory oversight
- The Animal Justice Party endorses both an AISI and AI legislation
- The Libertarian Party opposes an AISI as a "government scheme"

Notably absent are positions from the Coalition and Labor, neither of which has announced concrete AI safety policies. The gap is striking in an election with a high chance of a hung parliament, at a time when Australian voters are demanding long-term vision and both major parties face decades of declining primary votes.

Public sentiment is clear: a 2024 survey from the University of Queensland found that 8 in 10 Australians believe the country should lead in international AI governance, and 9 in 10 support creating a dedicated regulatory body. The same survey identified preventing dangerous and catastrophic AI outcomes as the top priority, with 8 in 10 Australians believing that preventing AI-driven extinction should be a global priority. A 2023 Ipsos survey found Australians to be the most concerned about AI of any country surveyed.

The virtual town hall is open to public participation. Register at AustraliansForAISafety.com.au to submit questions and engage with experts and candidates.

After the Town Hall, Australians for AI Safety will formally send their open letter to candidates and finalise an election scorecard for distribution. The letter will remain open for support by Australian experts and members of the public until election day.


*AI Experts Panel*

Kimberlee Weatherall
Professor of Law at the University of Sydney, a Chief Investigator with the ARC Centre of Excellence for Automated Decision-Making and Society, and Co-Director of the University of Sydney's Centre for AI, Trust and Governance.

Liam Carroll
Researcher in AI Safety at the Gradient Institute. Liam completed his Master's in Mathematics at the University of Melbourne. His research and practical work focus on ensuring that powerful AI systems are both beneficial and safe.

Jisoo Kim
Co-Founder of Clear AI. She has a Master of National Security Policy from the ANU and has held various roles in Canberra, including as Digital Media Adviser to former Prime Minister Scott Morrison.

Jess Graham
Researcher at the University of Queensland with a Bachelor of Psychological Sciences. She has completed research into Australian public priorities for AI development and regulation, and is currently working on AI risks at MIT.

Mr Gregory Sadler
Good Ancestors Policy
+61 401 534 879
