QUIC

A transport protocol originally developed by Google and later standardized by the IETF (RFC 9000) that provides multiplexed connections over UDP with TLS 1.3 encryption built in. QUIC eliminates transport-layer head-of-line blocking between streams, supports connection migration across network changes, and reduces connection-establishment latency.

QUIC reimagines internet transport for the modern web. By building on UDP rather than TCP, it avoids the ossified middlebox behavior that makes TCP improvements difficult to deploy. QUIC implements reliability, congestion control, and flow control in user space, enabling faster protocol evolution. Connection migration allows ongoing transfers to survive network changes, such as switching from Wi-Fi to cellular.
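The latency difference is easiest to see with a back-of-the-envelope model. The sketch below is illustrative Python, not a protocol implementation: it counts the round trips needed before application data can flow, assuming TLS 1.3 for the TCP case.

```python
# Illustrative model of connection-setup cost before the first request
# can be sent. Round-trip counts assume TLS 1.3; real networks add
# variance, loss, and retransmission on top of this.

def setup_latency_ms(rtt_ms: float, protocol: str) -> float:
    """Return handshake latency before application data can flow."""
    round_trips = {
        "tcp+tls1.3": 2,  # TCP 3-way handshake (1 RTT) + TLS 1.3 (1 RTT)
        "quic": 1,        # transport and crypto handshakes combined
        "quic-0rtt": 0,   # resumed session: data rides the first flight
    }
    return rtt_ms * round_trips[protocol]

if __name__ == "__main__":
    rtt = 80.0  # a plausible mobile round-trip time in milliseconds
    for proto in ("tcp+tls1.3", "quic", "quic-0rtt"):
        print(f"{proto:>10}: {setup_latency_ms(rtt, proto):.0f} ms")
```

On an 80 ms mobile link, this model puts QUIC a full round trip ahead of TCP+TLS on a fresh connection, and two ahead on resumption.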

For AI product teams, QUIC's benefits are most visible in mobile and global scenarios. Mobile AI applications that hold persistent connections for real-time features, such as live transcription, continuous recommendations, or streaming model responses, benefit from connection migration, which keeps sessions alive when users move between networks. Growth teams focused on mobile engagement metrics will see fewer connection failures and faster reconnections. QUIC's zero-RTT (0-RTT) connection resumption also benefits repeat visitors: subsequent visits to AI-powered web applications feel noticeably faster, which directly affects return-user engagement metrics.
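Connection migration works because QUIC identifies a connection by a connection ID carried in every packet, not by the TCP-style 4-tuple of addresses and ports. A minimal sketch of the server-side lookup (hypothetical state handling, not a real QUIC stack):

```python
# Sketch: why QUIC sessions survive address changes.
# TCP identifies a connection by (src_ip, src_port, dst_ip, dst_port),
# so a new client address means a new, unrelated connection.
# QUIC looks up the session by a connection ID instead, so the same
# session is found even after a Wi-Fi -> cellular switch.

connections: dict[str, dict] = {}  # connection ID -> session state

def handle_packet(conn_id: str, client_addr: tuple[str, int]) -> dict:
    """Find (or create) session state by connection ID alone."""
    session = connections.setdefault(conn_id, {"bytes_sent": 0})
    session["client_addr"] = client_addr  # path updates; session persists
    return session

# Client starts on a Wi-Fi address...
s1 = handle_packet("c0ffee", ("192.0.2.10", 51000))
# ...then migrates to a cellular address mid-transfer.
s2 = handle_packet("c0ffee", ("198.51.100.7", 49000))
assert s1 is s2  # same session state despite the address change
```

A TCP-keyed table would have created a second, empty session on the address change; keying by connection ID is what lets the transfer continue uninterrupted.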

Related Terms

Content Delivery Network

A geographically distributed network of proxy servers that caches and delivers content from locations closest to end users. CDNs reduce latency, improve load times, and absorb traffic spikes by serving content from edge nodes rather than a single origin server.
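The routing idea behind a CDN, sending each user to the edge node with the lowest latency, can be sketched in a few lines. The node names and latency figures below are hypothetical:

```python
# Sketch: a CDN directs each request to the edge node with the lowest
# measured latency to the user. Nodes and numbers are hypothetical.

edge_latency_ms = {
    "fra": {"eu-user": 12,  "us-user": 95,  "ap-user": 160},
    "iad": {"eu-user": 90,  "us-user": 8,   "ap-user": 170},
    "sin": {"eu-user": 165, "us-user": 180, "ap-user": 15},
}

def nearest_edge(user: str) -> str:
    """Pick the edge node with the lowest latency for this user."""
    return min(edge_latency_ms, key=lambda node: edge_latency_ms[node][user])
```

Real CDNs make this decision with anycast routing or DNS-based steering rather than a static table, but the objective is the same.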

Edge Computing

A distributed computing paradigm that processes data closer to the source of generation rather than in a centralized data center. Edge computing reduces latency, conserves bandwidth, and enables real-time processing for latency-sensitive applications.

Serverless Computing

A cloud execution model where the provider dynamically manages server allocation and scaling. Developers deploy functions or containers without provisioning infrastructure, paying only for actual compute time consumed rather than reserved capacity.

Function as a Service

A serverless computing category where developers deploy individual functions that execute in response to events. FaaS platforms like AWS Lambda, Google Cloud Functions, and Azure Functions handle all infrastructure management, scaling each function independently.
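The FaaS programming model, individual functions bound to events, can be sketched as a registry mapping event types to handlers. The decorator and event shape below are illustrative, not any real platform's SDK; real platforms add packaging, scaling, and retries on top:

```python
# Sketch of the FaaS model: small functions registered against event
# types, invoked by the platform when matching events arrive.

handlers: dict[str, callable] = {}

def on_event(event_type: str):
    """Register the decorated function as the handler for one event type."""
    def register(fn):
        handlers[event_type] = fn
        return fn
    return register

@on_event("object.created")
def make_thumbnail(event: dict) -> str:
    # A developer-supplied function; the platform never sees its internals.
    return f"thumbnail for {event['key']}"

def dispatch(event: dict) -> str:
    """Platform side: route an incoming event to its registered handler."""
    return handlers[event["type"]](event)
```

Because each handler is registered independently, the platform can scale or retire each one on its own, which is the property the definition above describes.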

Platform as a Service

A cloud computing model that provides a complete development and deployment environment without managing underlying infrastructure. PaaS offerings like Heroku, Vercel, and Google App Engine handle servers, storage, networking, and runtime configuration.

Infrastructure as a Service

A cloud computing model that provides virtualized computing resources over the internet. IaaS offerings like AWS EC2, Google Compute Engine, and Azure Virtual Machines give teams full control over servers, storage, and networking without owning physical hardware.