The Adapter pattern makes things compatible; better yet, it is how you evolve systems in production without breaking clients
The Adapter Pattern is a design pattern that works like a translator between two systems that don’t naturally fit together. Instead of rewriting one system to match the other, you build an adapter that converts one interface into the shape the client expects. This lets old code work with new code, or mismatched APIs work together, without either side knowing about the differences. In short, it makes incompatible things compatible without changing their source. In this post we will look beyond that definition and use the Adapter pattern as a change isolator, a migration enabler, and a compatibility shield.
1. Adapter pattern creates a bridge between mismatched interfaces.
The code below shows how an adapter acts as a translator between two systems that “speak” different formats. The old payment gateway expects amounts in cents (like 4999), but the new payment processor works in dollars (like 49.99). Instead of rewriting the old gateway or changing every client that calls it, we build a PaymentAdapter that takes in dollars, converts them to cents, and forwards the request to the old system. This way, the rest of the app can use the modern dollar-based API, while under the hood the legacy system keeps running without breaking. It’s a clean bridge between old and new.
// Legacy API: works in cents
class OldPaymentGateway {
  makePayment(amountInCents) {
    console.log(`Paid ${amountInCents} cents via OldPaymentGateway`);
  }
}

// New API: works in dollars (the interface our clients expect)
class NewPaymentProcessor {
  pay(amountInDollars) {
    console.log(`Paid $${amountInDollars} via NewPaymentProcessor`);
  }
}

// Adapter: translates $ → cents
class PaymentAdapter {
  constructor(legacyProcessor) {
    this.legacyProcessor = legacyProcessor;
  }

  pay(amountInDollars) {
    // Round to avoid floating-point drift (49.99 * 100 === 4998.9999…)
    const cents = Math.round(amountInDollars * 100);
    this.legacyProcessor.makePayment(cents);
  }
}

// Usage
const legacy = new OldPaymentGateway();
const adapter = new PaymentAdapter(legacy);
adapter.pay(49.99);
// Internally calls the legacy system with 4999 cents
2. Adapter as a migration enabler
The snippet below shows how an adapter acts as a migration enabler when moving from a REST API to GraphQL. Instead of rewriting all the client code that expects fetch("/api/..."), we override fetch itself with an adapter that pretends to be REST but under the hood talks to GraphQL. This way, existing client code continues working unchanged, while the backend evolves. The adapter intercepts the REST-like URL, extracts the necessary parameters (like userId), constructs an equivalent GraphQL query, sends it to the GraphQL endpoint, and finally reshapes the response so it looks REST-like again. Clients never know the switch happened.
// --- Adapter: REST facade on top of GraphQL ---
// Keep a reference to the real fetch BEFORE overriding it below,
// otherwise the adapter's own "/graphql" call would loop back into itself
const originalFetch = globalThis.fetch;

async function fetchRestLike(url) {
  // If the client calls something like /api/users/123
  if (url.startsWith("/api/users/")) {
    // Extract the user ID from the REST-like path
    const id = url.split("/").pop();

    // Define the equivalent GraphQL query
    const query = `
      query GetUser($id: ID!) {
        user(id: $id) { id name email }
      }
    `;

    // Send a POST request to the GraphQL endpoint instead of REST
    const res = await originalFetch("/graphql", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ query, variables: { id } }) // inject extracted id
    });
    const data = await res.json();

    // Adapter reshapes GraphQL response → REST-like response
    // It mimics the "res.json()" method of the Fetch API
    return { json: async () => data.data.user };
  }

  // If some other endpoint is called that we haven’t adapted yet
  throw new Error("Unsupported endpoint");
}

// Override the global fetch so existing client code uses our adapter
globalThis.fetch = fetchRestLike;

// --- Existing client code (unchanged) ---
async function getUser(id) {
  // Looks like a REST call, but is secretly routed through GraphQL
  const res = await fetch(`/api/users/${id}`);
  return res.json(); // works as before
}

// Client still calls getUser the same way, no migration required
getUser(1).then(console.log);
In the code above, the adapter keeps old code that expects a REST API working even though the backend has migrated to GraphQL, a newer way of building APIs where, instead of calling fixed URLs, the client sends a query describing exactly the data it wants:
query {
  user(id: 123) { id name email }
}
This avoids multiple REST calls and gives the client more control over what data it gets.
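To make that concrete, here is a standalone sketch; the /api/users/:id/posts endpoint and the posts field are assumptions for illustration, not part of the examples above. Fetching a user together with their posts takes two round trips over REST, while GraphQL returns exactly the requested fields in one request.
// Standalone sketch: the same data over REST vs. GraphQL
// (/api/users/:id/posts and the `posts` field are illustrative assumptions)

// REST: two round trips, each response carrying more fields than we need
async function getUserWithPostsRest(id) {
  const user = await fetch(`/api/users/${id}`).then((r) => r.json());
  const posts = await fetch(`/api/users/${id}/posts`).then((r) => r.json());
  return { ...user, posts };
}

// GraphQL: one round trip, returning only the fields we asked for
async function getUserWithPostsGraphQL(id) {
  const res = await fetch("/graphql", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      query: `query GetUser($id: ID!) { user(id: $id) { name posts { title } } }`,
      variables: { id }
    })
  });
  const { data } = await res.json();
  return data.user;
}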
3. Adapter as a Compatibility Shield
Sometimes we need to support old and new clients simultaneously, and breaking backward compatibility is often not an option. An adapter lets us present multiple shapes of the same data until clients catch up, as in the example below, where old clients expect snake_case and new ones expect camelCase.
// Core canonical model
const user = { id: 1, name: "Alice" };

// Adapter shields compatibility
function toSnakeCase(user) {
  return { id: user.id, full_name: user.name };
}

function toCamelCase(user) {
  return { id: user.id, name: user.name };
}

// --- API serving both client types ---
function getUser(format = "camel") {
  if (format === "snake") return toSnakeCase(user);
  return toCamelCase(user);
}

console.log(getUser("snake")); // { id: 1, full_name: "Alice" }
console.log(getUser("camel")); // { id: 1, name: "Alice" }
In the code above, instead of forcing all clients to switch to a new data format immediately, we provide a controlled adapter layer that serves both snake_case and camelCase views of the same canonical source. Legacy consumers keep functioning without modification while modern consumers adopt the new convention. The real power here is that the core model stays stable and all format-specific complexity is pushed into the adapter layer, which minimizes ripple effects across the system and enables smooth migrations without breaking existing integrations.
Sometimes compatibility isn’t just about data formats; it’s about paradigms. A common pain in JavaScript is migrating from legacy callback-based APIs to modern Promise/async flows. An adapter lets you mask the old contract while exposing a clean, future-proof interface, so teams can modernize incrementally without breaking all client code at once, as shown below.
// Legacy: callback-based API (hard to compose / error-prone)
function getFile(path, cb) {
  setTimeout(() => cb(null, "file contents"), 500);
}

// Adapter: wraps the callback in a Promise (isolates legacy style)
function getFileAsync(path) {
  return new Promise((resolve, reject) => {
    getFile(path, (err, data) => {
      if (err) reject(err);  // Preserve error semantics
      else resolve(data);    // Modern async-friendly contract
    });
  });
}

// Usage: clean async/await code (no one cares callbacks existed)
(async () => {
  const contents = await getFileAsync("/tmp/data.txt");
  console.log(contents); // "file contents"
})();
4. What to avoid when dealing with adapters
- a. Leaky adapters. An adapter’s job is to completely hide the quirks of the legacy or external API so the client can work with a clean, modern interface. If the client still has to know about legacy details (awkward parameter units, old naming conventions, weird error codes), then the adapter has “leaked” the complexity instead of isolating it.
// Leaky: clients still pass cents
class BadPaymentAdapter {
  constructor(legacy) {
    this.legacy = legacy;
  }

  pay(amountInCents) {
    // Still expects clients to think in cents
    this.legacy.makePayment(amountInCents);
  }
}
- b. Too many tiny adapters. One generic adapter layer is usually enough for key transformations, as shown below.
function normalizeKeys(obj, transformFn) {
  return Object.fromEntries(
    Object.entries(obj).map(([k, v]) => [transformFn(k), v])
  );
}
The code above shows a generic adapter for object key transformation, instead of many small, one-off adapters. The function normalizeKeys takes an object and a transformation function (transformFn) that defines how to change the keys (for example, camelCase to snake_case, or uppercasing every key). It uses Object.entries() to turn the object into [key, value] pairs, applies the transformation to each key, and then reconstructs a new object with Object.fromEntries(). This makes the adapter reusable and scalable: you don’t need a new adapter every time the key format changes, because the translation logic lives in a single, composable layer.
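As a usage sketch, one transform function can cover a whole family of conversions; camelToSnake below is a hypothetical helper written for this example, not part of any library.
// camelToSnake is an illustrative helper for this sketch
const camelToSnake = (key) => key.replace(/([A-Z])/g, "_$1").toLowerCase();

const apiUser = { userId: 1, fullName: "Alice", createdAt: "2024-01-01" };

console.log(normalizeKeys(apiUser, camelToSnake));
// { user_id: 1, full_name: "Alice", created_at: "2024-01-01" }

console.log(normalizeKeys(apiUser, (k) => k.toUpperCase()));
// { USERID: 1, FULLNAME: "Alice", CREATEDAT: "2024-01-01" }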
- c. Badly written adapters can add overhead. The code below highlights the performance tax of naive adapters. In slowAdapter, every request does a full deep clone with JSON.parse(JSON.stringify(...)), which is expensive for large objects and unnecessary when you only need a subset of fields. In contrast, fastAdapter adapts only what matters (ID → id, FullName → name), avoiding wasted computation and memory churn. In high-throughput systems (finance, IoT, streaming) this distinction can be the difference between smooth performance and bottlenecks, so adapters should be surgical, not blunt instruments.
// Inefficient: deep clone on every request
function slowAdapter(data) {
  return JSON.parse(JSON.stringify(data));
}

// Efficient: adapt only the necessary fields
function fastAdapter(user) {
  return { id: user.ID, name: user.FullName };
}
5. More tips on adapters
- Feature flag adapters. We can switch between adapters based on a feature flag at runtime. In the code below we use a feature flag (FEATURE_GRAPHQL) read from the environment to control which adapter is active: if the flag is set to "true", getUser routes calls through graphQLAdapter, otherwise it falls back to the old restAdapter (both are sketched after the snippet). This pattern lets teams migrate incrementally, running both systems in parallel, testing the new path safely in production, and flipping the switch for all users only when confident, all without changing the client code.
const useGraphQL = process.env.FEATURE_GRAPHQL === "true";

async function getUser(id) {
  return useGraphQL ? graphQLAdapter(id) : restAdapter(id);
}
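restAdapter and graphQLAdapter are not defined in the snippet above; under the same assumptions as the earlier examples (a /api/users/:id REST endpoint and a /graphql endpoint), they might look roughly like this minimal sketch.
// Minimal sketches of the two adapters; the endpoints are assumptions carried over from earlier examples
async function restAdapter(id) {
  const res = await fetch(`/api/users/${id}`);
  return res.json();
}

async function graphQLAdapter(id) {
  const res = await fetch("/graphql", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      query: `query GetUser($id: ID!) { user(id: $id) { id name email } }`,
      variables: { id }
    })
  });
  const { data } = await res.json();
  return data.user; // same shape as the REST response, so callers never notice which path ran
}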
- Test adapters in isolation. If the adapter is wrong, everything upstream breaks even when the legacy or new system works fine. In the code below we mock the dependency (here makePayment) and assert that $10 correctly becomes 1000 cents, which ensures the adapter’s contract is reliable, self-contained, and won’t silently propagate errors across the system during migrations or API changes.
test("PaymentAdapter converts $ to cents", () => {
const mock = { makePayment: jest.fn() };
const adapter = new PaymentAdapter(mock);
adapter.pay(10); // $10 - 1000 cents
expect(mock.makePayment).toHaveBeenCalledWith(1000);
});
- Describe the adapter appropriately. In the code below we explicitly state the adapter’s purpose, ownership, and removal condition. This makes the team’s intentions clear, prevents future developers from mistaking the adapter for core architecture, reduces the risk of forgotten technical debt, and ensures accountability for cleaning it up once the REST clients are retired.
/**
 * Temporary adapter during GraphQL migration.
 * Owner: Payments Team
 * Remove after REST clients are retired.
 */
class PaymentAdapter { /* ... */ }
INFO
Most migrations fail not because the new system is wrong, but because the bridge (the adapter) is brittle. Build adapters the way you would build production APIs: they are your migration lifeline.
If this interested you, check out my Javascript Book