
AI & MSP News
2 May 2026
10 min read

Amazon Bedrock Launches OpenAI Models gpt-oss-120b and gpt-oss-20b


OpenAI Models on AWS Now Available via Bedrock

The deployment of OpenAI’s gpt-oss-120b and gpt-oss-20b models on AWS marks a pivotal shift, granting organisations the ability to run high-performance reasoning models with complete control over their infrastructure. With this launch, OpenAI models on AWS are no longer limited to standard API calls; they are now available as open weight options. By integrating these models, businesses can achieve a balance between sophisticated text generation and the rigorous security requirements of a private cloud environment.

Flexible Access Through Amazon Bedrock and SageMaker JumpStart

Developers can now access these models through two primary pathways: Amazon Bedrock and SageMaker JumpStart. Bedrock offers a streamlined, OpenAI-compatible endpoint that allows teams to use their existing OpenAI SDK code or transition to the Bedrock Converse API. This flexibility reduces the technical debt typically associated with switching AI providers and allows for rapid experimentation without extensive code rewrites.
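As a rough sketch, the OpenAI-compatible pathway can look like the Python snippet below. The region, the API key placeholder, and the `openai.gpt-oss-120b-1:0` model ID are assumptions; confirm the exact endpoint and model IDs in your Bedrock console.

```python
# Minimal sketch: pointing the standard OpenAI SDK at Bedrock's
# OpenAI-compatible endpoint instead of api.openai.com.

def bedrock_openai_endpoint(region: str) -> str:
    """Build the assumed OpenAI-compatible Bedrock runtime URL for a region."""
    return f"https://bedrock-runtime.{region}.amazonaws.com/openai/v1"

def make_client(region: str, api_key: str):
    """Create an OpenAI SDK client that talks to Bedrock."""
    from openai import OpenAI  # lazy import so the URL helper needs no SDK
    return OpenAI(base_url=bedrock_openai_endpoint(region), api_key=api_key)

# Usage (requires network access, granted model access, and a Bedrock API key):
# client = make_client("ap-southeast-2", "<bedrock-api-key>")
# reply = client.chat.completions.create(
#     model="openai.gpt-oss-120b-1:0",  # assumed Bedrock model ID
#     messages=[{"role": "user", "content": "Summarise our SLA in one line."}],
# )
# print(reply.choices[0].message.content)
```

Because only the base URL and credentials change, existing scripts built on the OpenAI SDK can be repointed without touching the application logic.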

For organisations that require deeper customisation, SageMaker JumpStart provides a robust AI infrastructure for evaluating and comparing the performance of the gpt-oss-120b and gpt-oss-20b models. These tools enable IT managers to test reasoning tasks in a sandbox environment before moving to full-scale production. This level of oversight is essential for ensuring that AI applications remain performant and cost-effective as they scale.

Scalable AI Solutions for Australian Businesses

Australian businesses are now positioned to build highly scalable applications that leverage the reasoning capabilities of gpt-oss-120b. Whether developing customer-facing support agents or complex internal data analysis tools, local firms can experiment with these models to find the right fit for their specific operational needs. This local availability ensures that sensitive data remains within a governed environment while still accessing world-class AI power.

The transition to open weight models gives companies the freedom to manage their underlying data and compute resources more granularly. This is a critical component for any forward-thinking AI strategy, especially in industries where data sovereignty and compliance are non-negotiable. By maintaining control over the model weights, businesses can ensure their AI assets remain secure and fully integrated into their existing workflows.

Leveraging these foundation models within a familiar ecosystem allows for a more cohesive approach to digital transformation. This integration also simplifies the technical requirements for developers, as they can now manage their AI models and cloud infrastructure through a single, unified interface.

Seamless Integration with Amazon Bedrock and SageMaker

Developers can now point their existing OpenAI SDK scripts directly to an Amazon Bedrock endpoint, eliminating the need for major infrastructure overhauls or complex code migrations. This OpenAI-compatible endpoint ensures that technical teams already familiar with the library can deploy OpenAI models on AWS with minimal friction. By supporting both the standard SDK and the native Bedrock Converse API, AWS provides a "plug-and-play" approach to high-performance AI. This interoperability is essential for companies looking to maintain a versatile cloud solutions environment without being tethered to a single development framework.
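For teams preferring the native AWS tooling, the equivalent request through the Bedrock Converse API with boto3 can be sketched as follows. The model ID is an assumption; confirm the exact value with `aws bedrock list-foundation-models`.

```python
# Sketch: calling a gpt-oss model through the Bedrock Converse API.

def build_messages(prompt: str) -> list:
    """Shape a user prompt into the Converse API message format."""
    return [{"role": "user", "content": [{"text": prompt}]}]

# Usage (requires boto3, AWS credentials, and granted model access):
# import boto3
# client = boto3.client("bedrock-runtime", region_name="ap-southeast-2")
# resp = client.converse(
#     modelId="openai.gpt-oss-20b-1:0",  # assumed model ID
#     messages=build_messages("Draft a two-line status update."),
# )
# print(resp["output"]["message"]["content"][0]["text"])
```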

Optimising AI Infrastructure with SageMaker JumpStart

Before moving to a full-scale rollout, organisations can use SageMaker JumpStart to thoroughly evaluate and compare the gpt-oss-120b and 20b models against their specific business use cases. This specialised environment provides the necessary tools to customise these open weight models, ensuring they align perfectly with internal data and performance benchmarks. Once a model is refined through testing, it can be deployed into production via the SageMaker AI console or the SageMaker Python SDK. This level of granular control over AI infrastructure allows for precise fine-tuning that traditional, closed-API services often lack.
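A minimal sketch of that evaluate-then-deploy flow with the SageMaker Python SDK might look like this. The JumpStart model IDs and the instance type below are hypothetical placeholders; list the actual IDs in the SageMaker AI console before deploying.

```python
# Sketch: choosing between the two gpt-oss sizes, then deploying the
# selection through SageMaker JumpStart.

def pick_model(task_tier: str) -> str:
    """Map a rough workload tier to a (hypothetical) JumpStart model ID."""
    models = {
        "light": "openai-gpt-oss-20b",   # hypothetical ID for lighter reasoning tasks
        "heavy": "openai-gpt-oss-120b",  # hypothetical ID for complex reasoning tasks
    }
    return models[task_tier]

# Usage (requires the sagemaker package, AWS credentials, and instance quota):
# from sagemaker.jumpstart.model import JumpStartModel
# model = JumpStartModel(model_id=pick_model("heavy"))
# predictor = model.deploy(initial_instance_count=1, instance_type="ml.p5.48xlarge")
# print(predictor.predict({"inputs": "Explain data sovereignty in two sentences."}))
```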

Strategic Agility Through Model Switching

The ability to switch between different AI providers without rewriting underlying code transforms model selection into a distinct strategic advantage for Australian businesses. Companies are no longer locked into a single vendor's roadmap, allowing them to pivot as new innovations emerge from leading AI developers. This flexibility helps IT managers ensure their AI strategy remains resilient against market shifts and technical obsolescence. By decoupling the application logic from the specific model, teams can prioritise performance and cost-efficiency in real time as project requirements evolve.

Deploying these models within the AWS ecosystem also simplifies the security layer, as businesses can leverage existing IAM roles and VPC configurations for model access. Integrating high-performance reasoning models into current workflows does not require a compromise on cybersecurity protocols. Whether you are automating internal documentation or building a customer-facing service, the underlying infrastructure remains governed and secure. This seamless bridge between raw AI power and enterprise-grade reliability is what sets this integration apart for professional environments.
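As an illustration of that IAM-based governance, a scoped policy for model access might look like the sketch below. The region and the model ID pattern in the resource ARN are assumptions; match them to the ARNs shown in your own Bedrock console.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowGptOssInvocation",
      "Effect": "Allow",
      "Action": [
        "bedrock:InvokeModel",
        "bedrock:InvokeModelWithResponseStream"
      ],
      "Resource": "arn:aws:bedrock:ap-southeast-2::foundation-model/openai.gpt-oss-*"
    }
  ]
}
```

Attaching a policy like this to an existing role keeps model invocation inside the same governance boundary as the rest of the AWS estate.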

AWS continues to expand its selection of foundation models, and the inclusion of gpt-oss models provides a robust alternative to proprietary, black-box APIs. Organisations can now mix and match capabilities, using the most efficient model for each specific task within their application stack. This modular approach to development encourages continuous evolution and experimentation across the entire business. This deep technical integration serves as a vital foundation for even more complex investments and partnerships designed to bring AI to an unprecedented scale.

The $50 Billion Amazon and OpenAI Strategic Partnership

Amazon is committing $50 billion to its partnership with OpenAI, a massive capital injection designed to accelerate the development of advanced, bespoke artificial intelligence for global enterprises. This investment signals a major shift from a simple vendor relationship to a deep strategic alignment aimed at delivering OpenAI models on AWS at an unprecedented scale. By combining OpenAI’s sophisticated reasoning capabilities with Amazon’s global footprint, both companies are positioning themselves to lead the next era of industrial AI applications.

Scaling AI Infrastructure with Bespoke Models

Sam Altman, CEO of OpenAI, has highlighted that this collaboration will put powerful AI tools into the hands of users at "real scale." The partnership focuses on optimising AI infrastructure to handle the massive compute requirements of models like the gpt-oss-120b. This ensures that as businesses transition from experimental pilots to full-scale production, the underlying hardware can meet the demand for low-latency, high-accuracy reasoning. For Australian businesses, this means access to enterprise-grade stability for even the most complex automation tasks.

These tailored models will specifically power Amazon’s own customer-facing applications and digital agents. By customising OpenAI’s architecture, Amazon teams can create specialised experiences that complement their existing Nova model family. This multi-model approach allows organisations to select the best tool for each specific task, whether it involves natural language processing or complex logical reasoning. It also provides a diverse toolkit for developers who need to build at scale without hitting the performance ceilings often found in generic models.

Developing a Resilient AI Strategy

For Australian business owners and IT managers, this partnership provides a clear roadmap for a long-term AI strategy. Using Amazon Bedrock as a central foundation, companies can now deploy OpenAI models on AWS alongside other industry-leading model families. This ensures that technical teams have the flexibility to adapt their tech stack as the landscape evolves, rather than being locked into a single ecosystem. Having access to multiple model architectures allows for better cost optimisation and precise performance tuning across different business units.

Building Secure Agents for Customer Interaction

A major component of this collaboration involves the development of autonomous agents capable of managing customer interactions directly. These agents require the high availability and rigorous security standards that only a tier-one cloud provider can offer. Organisations can leverage professional services for AI agent deployment to ensure these tools are implemented securely within their local environments. By combining OpenAI’s reasoning with AWS’s global presence, these agents can handle complex workflows that were previously too difficult to automate reliably.

The ability to customise these models within SageMaker JumpStart provides a level of control that was previously difficult to achieve with closed-API models. IT managers can now fine-tune model parameters to align with specific data sovereignty requirements and performance benchmarks. This level of technical oversight is essential for maintaining a competitive edge while ensuring that all automated processes remain transparent and compliant. This new level of control over model weights and infrastructure allows for a more secure and predictable path toward total digital transformation.

Future-Proofing AI Infrastructure for Australian Business

The introduction of stateful runtime environments within the AWS ecosystem represents a paradigm shift for developers building persistent, context-aware AI applications. AWS claims this technological advancement will fundamentally change what is possible for organisations looking to move beyond simple chat interfaces and into complex, multi-step agentic workflows. For local enterprises, this means OpenAI models on AWS can now maintain consistent states across sessions, significantly improving the reliability of automated systems. This capability is vital for Australian firms exploring AI agent deployment strategies designed to handle ongoing customer interactions or long-term data analysis.

Data Sovereignty Through Open Weight Control

Maintaining absolute control over sensitive corporate data is a non-negotiable priority for local IT managers, especially when leveraging high-performance models like gpt-oss-120b. By utilising open weight models, organisations retain full ownership of their AI infrastructure and the data processed within it. Unlike traditional closed-API services, these models allow businesses to manage their own model weights and security parameters within Amazon Bedrock. This level of technical oversight offers several strategic benefits for the modern enterprise:

  • Granular Security: Businesses can apply custom security layers directly to the model environment.
  • Infrastructure Autonomy: IT teams can decide exactly where and how the models are hosted to meet compliance standards.
  • Performance Tuning: Developers can use SageMaker JumpStart to fine-tune models for specific industry tasks without exposing training data to external vendors.

This autonomy ensures that proprietary information remains within a governed environment, directly addressing Australian data sovereignty and cybersecurity requirements. By choosing open weights, firms are no longer reliant on the opaque updates or data handling policies of external providers. Instead, they can treat their AI models as core internal assets that are as secure as their primary database or local servers. This control is essential for industries like finance and healthcare where data privacy is mandated by law.

Strategic Agility for IT Managers

The partnership between AWS and OpenAI empowers local IT leaders to continuously evolve their AI strategy as leading innovators release new breakthroughs. By integrating these tools into their existing cloud solutions, businesses can avoid vendor lock-in and pivot as the technological landscape shifts. Access to both the gpt-oss-120b and gpt-oss-20b models allows teams to scale their reasoning capabilities based on the complexity of the task at hand. This flexibility ensures that an organisation’s technology stack remains resilient and ready to adopt the next generation of generative AI advancements.

As Sam Altman noted, this collaboration puts powerful AI into the hands of users at a "real scale," enabling the creation of customised models that serve customers more effectively. By leveraging these stateful environments and open weight models, Australian enterprises can build a foundation that is capable of adapting to future disruptions. This strategic approach to infrastructure management ensures that today's investments remain relevant even as the pace of AI innovation continues to accelerate globally. Such a foundation allows businesses to focus on delivering value rather than worrying about the underlying technical limitations of their AI tools.

Frequently Asked Questions

Which OpenAI models are currently available on AWS?

The OpenAI gpt-oss-120b and gpt-oss-20b open weight models are now available. These models are specifically designed for text generation and reasoning tasks within the AWS ecosystem.

How can I access OpenAI models on Amazon Bedrock?

You can access these models via an OpenAI-compatible endpoint in Bedrock. Developers can point the standard OpenAI SDK to this endpoint or utilise the Bedrock InvokeModel and Converse APIs.

What is the benefit of using open weight models on AWS?

Open weight models provide organisations with complete control over their infrastructure and data. This allows for greater customisation through SageMaker JumpStart and ensures that businesses can manage their AI applications according to specific security and performance requirements.


Future-Proof Your Business with OnIT Solutions

Staying on top of AI and technology trends is critical for Australian SMBs. Our team helps you cut through the noise and implement the right solutions for your business. Talk to our AI Strategy team about what today's developments mean for your organisation — or explore our full range of Managed IT Services.
