FireStart GmbH
Modernize your legacy web applications using Web Components
Description
In his devjobs.at TechTalk, Andreas Hässler of FireStart describes how the team rebuilt their legacy application using Web Components — and saved themselves several months of work in the process.
Video Summary
In Modernize your legacy web applications using Web Components, Andreas Hässler shows how a complex, jQuery-heavy model renderer was wrapped as a Web Component to run inside a new Angular portal without duplicating code. He walks through discarded options (copying code, an iframe with heavy load times) and the implementation steps: Webpack bundling with ES6 imports and TypeScript, integrating third‑party libraries, creating a custom element with Shadow DOM and a data setter, and publishing it as an npm library used by both old and new apps. Viewers gain a pragmatic pattern to modernize legacy UIs incrementally—use Web Components for encapsulation, reuse, and faster delivery.
Modernizing Legacy Web Apps with Web Components: A Practical Walkthrough from “Modernize your legacy web applications using Web Components” by Andreas Hässler (FireStart GmbH)
Why this talk matters: Mobile demand meets legacy complexity
At DevJobs.at, we keep an eye on pragmatic modernization stories—the ones that turn tight deadlines and complex codebases into focused engineering moves. In “Modernize your legacy web applications using Web Components,” Andreas Hässler (FireStart GmbH) offered exactly that: a disciplined, standards‑based path to bring a mission‑critical legacy renderer into a modern Angular application—without a costly rewrite.
The scenario was familiar and urgent. FireStart customers model business processes in a Windows application and view them via a web‑based Process Portal. Data showed more users accessing these models from mobile devices, yet the legacy web application wasn’t designed for that. On top of that, the tech stack made feature work and changes hard. The team decided to rebuild the portal with a modern stack (Angular). Progress was solid—except for one missing piece: the core model viewer.
“Reimplementing this from scratch wasn’t really feasible … it would have taken us months and would still not cover any edge cases.”
The renderer was complex, with many process elements and years of logic ensuring consistent modeling and rendering across the Windows and web applications. The deadline was closing in. The ask: ship a preview soon, avoid duplicated maintenance, and keep behavior consistent.
The problem space: Complexity, consistency, and parallel support
The model renderer sat at the heart of the experience: diverse process models, numerous elements, and refined behavior formed over years to handle tricky edge cases. Any replacement had to preserve that behavior across platforms. Meanwhile, the new Angular portal needed to move forward. And for a time, the old app needed to remain available—parallel support was non‑negotiable.
This combination—complex logic, strong consistency requirements, a looming deadline, and parallel operation—ruled out brute‑force rewrites and risky shortcuts.
Two tempting options—and why they failed the test
Hässler walked through the obvious candidates first, then ruled them out based on practical trade‑offs.
1) Copy the old code into the new portal
This would have minimized effort upfront, but it broke down under parallel support: any ongoing changes would have to be implemented twice. That risked divergence and increased engineering load precisely when the team needed focus.
2) Render the old portal inside an iframe
This had a clever twist: a query parameter could hide old UI chrome and display only the renderer. In practice, though, tests showed heavy load times—the new application and the iframe both loaded the same process elements.
“The load time was getting really high because the whole process elements needed to be loaded two times.”
Technically feasible, but not user‑friendly—especially under mobile constraints.
The pivot: Web Components as a clean integration boundary
The team turned to standard Web APIs: Web Components. The appeal here is straightforward: you can package UI into framework‑agnostic components and consume them from any stack—jQuery, React, Angular—without rewriting the internals. The plan: wrap the legacy model renderer as a Custom Element, use Shadow DOM to encapsulate it, and feed data from the new portal via a setter.
A reference mentioned in the session: https://www.webcomponents.eu. Web Components combine APIs like Custom Elements and Shadow DOM, providing precise encapsulation of markup, behavior, and styles.
Architecture at a glance: From legacy code to a reusable library
To make the renderer work as a drop‑in component, the team needed to change how it was built and distributed:
1) Create a bundlable codebase. The legacy app loaded scripts directly via index.html. The team migrated to explicit ES6 imports so a modern bundler could produce a reusable library.
2) Bring third‑party libraries under control. The renderer relied on jQuery and ConvertJS. Integrating jQuery and its plugins with Webpack is tricky; the team spent time iterating on configurations to make everything load correctly.
3) Define the Web Component. A Custom Element creates a Shadow Root and initializes the legacy renderer inside it. A setter accepts data from the new application. Once defined, the element can be used like any other HTML tag.
4) Encapsulate styles and logic. Shadow DOM prevents the outer application’s styles from leaking into the component. In the talk, Hässler showed how an “open” shadow root makes the internals visible in DevTools, whereas “closed” behaves like a native browser component.
The end result: an internal npm library exposing a Web Component that both the new Angular portal and the legacy application can consume—one source of truth for rendering behavior and edge cases.
Step 1: Bundling—migrating from ad‑hoc scripts to explicit ES6 imports
The old application had a bundling mechanism, but not one that could produce a reusable library for other projects. Switching to Webpack and adding import statements across files was the first major step.
- Script tags gave way to ES6 imports; dependencies became explicit instead of relying on load order.
- TypeScript support was introduced, improving safety for future refactoring of the renderer.
“We had to switch to something more modern like Webpack and actually add all the different import statements … so we could then bundle it inside a library.”
This step may sound procedural, but it’s foundational: once the renderer can be bundled, versioned, and consumed as a package, it becomes a real product the rest of the company can depend on.
Step 2: Third‑party libraries—taming jQuery and ConvertJS with Webpack
Next came the thorny part: wiring up jQuery and plugins in a Webpack world. Hässler didn’t dwell on configuration details—he explicitly avoided turning the talk into a config deep dive—but the core point was clear: expect trial and error when marrying older jQuery‑based code with modern bundling.
“For those of you who have tried to use jQuery together with Webpack, you will know that it’s kind of a pain.”
If you budget for that integration work, the payoff is a bundle that works predictably across environments.
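The talk deliberately skipped configuration specifics, but one classic sticking point — offered here as an assumption about the kind of friction involved — is that jQuery plugins expect a global `$`. Webpack’s `ProvidePlugin` can shim that:

```javascript
// webpack.config.js excerpt -- shimming jQuery for legacy plugins (illustrative).
const webpack = require("webpack");

module.exports = {
  plugins: [
    // Wherever a module references $ or jQuery as a free variable,
    // Webpack injects an import of the jquery package instead of
    // relying on a script tag having set the global first.
    new webpack.ProvidePlugin({
      $: "jquery",
      jQuery: "jquery",
      "window.jQuery": "jquery",
    }),
  ],
};
```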
Step 3: The Web Component—Custom Element plus Shadow DOM
With bundling solved and third‑party libraries loading, the team built the Custom Element:
- The element class initializes a Shadow Root.
- The legacy model renderer is mounted inside that Shadow DOM.
- A setter provides content/data from the outside (the new portal).
- The element is registered so the browser recognizes it; usage then resembles any standard HTML tag.
Shadow DOM is what seals the deal: no style leakage from the host application, and the component’s internals remain consistent. In open mode, developers can inspect the Shadow DOM in DevTools; in closed mode, it behaves like a native element—opaque and self‑contained.
“Shadow DOM encapsulates this, so your styling from the outside application cannot leak into the Web Component.”
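The element itself can be sketched roughly as follows. The tag name `process-model-viewer`, the `model` setter, and the render call are illustrative assumptions standing in for FireStart’s actual renderer; the stub base class only exists so the sketch also loads outside a browser:

```javascript
// Minimal sketch of wrapping a legacy renderer in a Custom Element.
// Fall back to a stub base class so the sketch also loads outside a browser.
const Base = typeof HTMLElement !== "undefined"
  ? HTMLElement
  : class { attachShadow() { return { innerHTML: "" }; } };

class ProcessModelViewer extends Base {
  constructor() {
    super();
    // "open" keeps the internals inspectable in DevTools;
    // switch to "closed" for native-like opacity once the component is stable.
    this.shadow = this.attachShadow({ mode: "open" });
  }

  // The host application (Angular or the legacy portal) pushes data in here.
  set model(data) {
    this._model = data;
    this.render();
  }

  render() {
    if (!this._model) return;
    // Hypothetical call into the bundled legacy renderer.
    this.shadow.innerHTML = `<div class="canvas">${this._model.name}</div>`;
  }
}

// Register the element only where the Custom Elements API exists;
// afterwards <process-model-viewer> works like any standard HTML tag.
if (typeof customElements !== "undefined") {
  customElements.define("process-model-viewer", ProcessModelViewer);
}
```

From the host’s side, usage then collapses to placing the tag and assigning the setter — no knowledge of jQuery or the renderer’s internals required.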
What shipped: A reusable library, a faster path to value, and consistent behavior
The outcome came quickly:
- The renderer was bundled as a reusable internal npm package, including jQuery and the required plugins.
- The Web Component encapsulated the renderer and its styles, and it could be used inside the next‑gen Angular portal without changing the underlying rendering logic.
- Crucially, the same package could still be used in the legacy application, maintaining a single source of truth for rendering behavior.
- Time to value improved dramatically: a few weeks to get started and integrate, compared to months for a risky reimplementation that might still miss edge cases.
“It took us just a couple of weeks to get started instead of the months that it would have taken to rebuild this from scratch.”
The team doubled down on the pattern: they applied the same Web Component approach to their dashboards component and saved time again, ultimately shipping the next‑gen application in time for release.
Why Web Components were the right tool for this job
From our vantage point, the choice fits the constraints perfectly:
- The renderer was battle‑tested and full of edge cases—a prime candidate for encapsulation rather than re‑creation.
- The new shell (Angular) could consume Web Components cleanly, without invasive glue code.
- Parallel operation demanded a shared implementation. An internal NPM package provided exactly that.
- iframe drawbacks—duplicated loading, slow performance, UX friction—were avoided.
Practical guidance for teams facing similar constraints
The session distilled a set of steps and guardrails that translate well to other organizations.
1) Identify the right candidates
- Which legacy modules are both high‑value and high‑risk to rewrite because of accumulated edge cases?
- Where is behavioral parity across old and new applications non‑negotiable?
2) Define a slim external interface
- Determine the minimal data your component needs from the host app; implement a clear setter or similar entry point.
- Decide what events or callbacks need to bubble up.
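For the outward direction, a composed custom event is the standard mechanism. The event name and `detail` shape below are illustrative assumptions, not details from the talk, and an `EventTarget` stands in for the element so the sketch runs anywhere:

```javascript
// Sketch of an outward event interface for a wrapped component.
class ModelSelectedEvent extends Event {
  constructor(detail) {
    // `composed: true` lets the event cross the shadow boundary to the host app.
    super("model-selected", { bubbles: true, composed: true });
    this.detail = detail;
  }
}

// Stand-in for the component instance; in the browser this is the element itself.
const component = new EventTarget();

// The host application (e.g. the Angular portal) subscribes:
component.addEventListener("model-selected", (e) => {
  console.log("host received:", e.detail.elementId);
});

// The component dispatches when the user selects a process element:
component.dispatchEvent(new ModelSelectedEvent({ elementId: "task-42" }));
// prints "host received: task-42"
```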
3) Prepare the codebase for packaging
- Replace script‑tag loading with ES6 imports; make dependency edges explicit.
- Use a modern bundler like Webpack to emit a library build.
- Add TypeScript support if possible to de‑risk future refactoring.
4) Tackle third‑party dependencies early
- Expect friction with jQuery and plugins; iterate on Webpack configs until everything loads correctly.
- Build a minimal harness to validate loading and runtime behavior before integrating into the main app.
5) Use Shadow DOM deliberately
- Open shadow roots aid debugging while you stabilize the component.
- Closed mode increases encapsulation once you’re confident in its behavior.
6) Optimize for performance and UX
- Avoid iframes if they cause duplicated loading.
- Keep your bundle lean; only include necessary dependencies.
7) Plan for parallel consumption
- Use the same package in the legacy and the new app to preserve a single source of truth.
- Version and release the package carefully to prevent divergence.
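The packaging side of that single source of truth can be sketched as a package manifest like the following — the scope, name, version, and registry URL are all hypothetical placeholders, not FireStart’s real values:

```json
{
  "name": "@acme/model-renderer",
  "version": "1.4.0",
  "description": "Shared process model renderer, consumed by both the legacy portal and the new Angular portal",
  "main": "dist/model-renderer.js",
  "types": "dist/index.d.ts",
  "publishConfig": {
    "registry": "https://npm.internal.example.com"
  }
}
```

Pinning both consumers to released versions of one package is what prevents the divergence that copying the code would have invited.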
Memorable implementation details
- The transformation from scattered script tags to explicit imports was more than cosmetic—it unlocked library packaging and reuse.
- The frank acknowledgment that jQuery + Webpack is “a pain” set the right expectations and focused effort where it mattered.
- The Shadow DOM open vs. closed explanation offered a practical debugging tip that many teams can adopt.
- Reusing the library inside the legacy application reinforced consistency and reduced maintenance risk.
Demo highlights: Encapsulation you can see
Hässler showed how Shadow DOM encapsulation keeps host styles from affecting the component. With an open shadow root, DevTools can reveal the component internals; switch to closed, and it becomes opaque like a native element. For teams in transition, starting with open mode is a pragmatic compromise between observability and isolation.
When to adopt this approach
Based on the talk, the approach is a strong fit when you have:
- a stable yet complex legacy UI module,
- a new host application (e.g., Angular) under development,
- a requirement to run legacy and new apps in parallel,
- and a deadline that rewards reuse over reinvention.
The framework‑agnostic nature of Web Components also future‑proofs the investment: even if the host framework changes later, the component remains portable.
Conclusion: A standard‑based, low‑risk migration path that delivers
“Modernize your legacy web applications using Web Components” by Andreas Hässler (FireStart GmbH) showcases a crisp lesson: when a complex, edge‑case‑rich UI module must live on in a modern application, Web Components provide a clean boundary and a fast track. By establishing a bundlable codebase, wrangling third‑party dependencies, and leveraging Shadow DOM for encapsulation, FireStart shipped sooner, kept behavior consistent, and avoided the pitfalls of duplication and iframes.
In Hässler’s words, the approach “saved us a lot of time.” The team applied it beyond the renderer—to dashboards as well—and delivered the next‑gen application on schedule. For engineering teams with similar constraints, this talk reads like a practical playbook you can adapt with confidence.