Educational Simulations • Cloud Development • AI Integration
From interactive educational simulations to enterprise-grade cloud solutions, we transform complex challenges into elegant digital experiences.
A comprehensive web-based business simulation platform serving higher education institutions worldwide. MISsimulation teaches Management Information Systems principles through an engaging political campaign simulation, providing students with hands-on experience in data-driven decision making, strategic planning, and business intelligence.
A robust, edge-deployed backend infrastructure built on Cloudflare Workers that powers multiple applications with shared authentication, data management, and service integrations. Designed for scalability, security, and global performance.
Real stories from our projects—the challenges we faced, decisions we made, and lessons we learned building educational platforms and cloud infrastructure.
When we first started working with MISsimulation, it was a monolithic .NET application that had been serving universities for years. It worked, but as the platform grew and expectations evolved, we knew we needed to modernize. The question wasn't whether we should migrate to .NET Core; it was how to do it without breaking everything students and instructors depended on.
The migration to .NET Core brought immediate benefits: better performance, cross-platform support, and access to modern libraries. But we didn't stop there. We realized that not everything needed to live inside the main application. Features like AI-powered strategic validation, payment processing, and email automation could be extracted into serverless microservices.
That's when we made the decision to embrace Cloudflare Workers for our expanding feature set. Instead of embedding new functionality into the monolith, we built a "Common Core" platform that serves as a shared backend running at the edge for multiple applications. Need AI validation? Call the edge worker. Need to send emails? Hit the edge worker. Need payment processing? You get the idea.
The result? A hybrid architecture where the main simulation engine runs on ASP.NET Core (it's great at what it does), while new features deploy to the edge in seconds. We maintain one codebase for shared services, achieve global distribution automatically, and can iterate on new features without touching the core platform. Plus, those fast cold starts on Cloudflare Workers mean students don't even notice the AI validation happening in real-time.
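The edge side of that pattern boils down to a single entry point that dispatches shared services by path. Here's a minimal sketch in TypeScript; the route names and handler bodies are illustrative, not our actual endpoints, and a real Cloudflare Worker would wrap this in its fetch handler:

```typescript
// Hypothetical "Common Core" dispatcher: one edge entry point, many shared
// services. Paths and responses are invented for illustration.
type Handler = (body: unknown) => { status: number; result: string };

const services: Record<string, Handler> = {
  "/ai/validate": (body) => ({ status: 200, result: `validated:${JSON.stringify(body)}` }),
  "/email/send": () => ({ status: 202, result: "queued" }),
  "/payments/charge": () => ({ status: 200, result: "charged" }),
};

function route(path: string, body: unknown): { status: number; result: string } {
  const handler = services[path];
  // Unknown paths are rejected rather than falling through to the monolith.
  if (!handler) return { status: 404, result: "unknown service" };
  return handler(body);
}
```

Because every application calls the same dispatcher, adding a shared service means adding one entry to the table, not touching the monolith.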
Here's a challenge we didn't see coming: How do you create a believable island population of 30,000+ residents, complete with demographics, political leanings, social media profiles, income clustering, vehicle ownership patterns, mortgage data, and voting behaviors, all in under 30 seconds?
Our simulation generator needed to create realistic universes for each instructor's course. Not just random data, but coherent populations where wealthy urban clusters might lean liberal and drive Teslas, while rural conservative areas would have different patterns. We needed social media influencers with realistic follower counts, mortgage data that reflected income distributions, geographic clustering that made demographic sense, and dozens of other interconnected dimensions that create a believable society.
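The trick to coherence is that residents don't get independently random traits; they inherit correlated traits from a cluster profile. A toy sketch of the idea, with profile names, probabilities, and fields invented for the example:

```typescript
// Illustrative only: traits are sampled from one cluster profile, so income,
// politics, and vehicle ownership stay mutually consistent.
interface ClusterProfile {
  name: string;
  medianIncome: number;
  leanLiberalProb: number; // probability a resident leans liberal
  evOwnershipProb: number; // probability of owning an electric vehicle
}

const profiles: ClusterProfile[] = [
  { name: "wealthy-urban", medianIncome: 140_000, leanLiberalProb: 0.7, evOwnershipProb: 0.4 },
  { name: "rural", medianIncome: 55_000, leanLiberalProb: 0.3, evOwnershipProb: 0.05 },
];

interface Resident { cluster: string; income: number; leansLiberal: boolean; ownsEV: boolean; }

function generateResident(p: ClusterProfile, rand: () => number): Resident {
  // Income is drawn around the cluster median; the same profile drives the
  // political and ownership draws, which is what makes clusters believable.
  return {
    cluster: p.name,
    income: Math.round(p.medianIncome * (0.5 + rand())),
    leansLiberal: rand() < p.leanLiberalProb,
    ownsEV: rand() < p.evOwnershipProb,
  };
}
```

The real generator layers many more correlated dimensions (mortgages, social media, geography) on the same principle.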
Initially, we built this on AWS Lambda. It worked fine for small populations, but as instructors requested larger islands with more complexity, we hit Lambda's limitations. Cold starts meant the first generation took forever. Regional deployment meant professors in Australia had to wait for data to cross the Pacific.
So we migrated to Cloudflare Workers. The catch? No file system. Our generator created CSV files and SQL dumps, but Workers run entirely in memory. We had to redesign the entire export system to build ZIP files in RAM, streaming data as we generated it. The generator runs at the edge, produces the complete dataset with all its interconnected dimensions, and the main .NET monolith picks it up and populates the MySQL database. Each resident gets their own unique set of traits, relationships, and behaviors.
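The in-memory export pattern looks roughly like this: generate rows lazily, join them into a byte buffer, and hand that buffer to a zip library that works on `Uint8Array`s. The field names below are illustrative, and the zip step itself is omitted:

```typescript
// Sketch of a file-system-free export, as Workers require: everything is
// built in RAM. ResidentRow fields are invented for the example.
interface ResidentRow { id: number; district: string; income: number; }

function* residentRows(count: number): Generator<ResidentRow> {
  // A generator lets us stream rows as they're produced instead of
  // materializing the whole population up front.
  for (let i = 0; i < count; i++) {
    yield { id: i, district: i % 2 === 0 ? "North" : "South", income: 40_000 + i };
  }
}

function toCsv(rows: Iterable<ResidentRow>): Uint8Array {
  const lines = ["id,district,income"];
  for (const r of rows) lines.push(`${r.id},${r.district},${r.income}`);
  // This in-memory buffer is what gets zipped and returned to the monolith.
  return new TextEncoder().encode(lines.join("\n"));
}
```

The same approach applies to the SQL dumps: build bytes in memory, package, respond.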
The result: instructors can now generate custom simulation universes from anywhere in the world, often in under 30 seconds. When we output those statistics (population breakdowns, income distributions, influencer counts, geographic clustering, ownership rates), we can confidently say the data represents a realistic society with coherent patterns, not just random numbers in a database.
Student registration seems simple until you actually try to build it. A student enters their email, we send a confirmation code, they verify it, accept terms and conditions, make a payment, and get assigned to a team. Easy, right?
Wrong. What happens when they close the browser mid-registration? What if the payment succeeds but the network fails before we record it? What if an instructor needs to manually approve a student? What if a student needs to restart because they entered the wrong email? We needed a system that could handle all these edge cases without losing track of where each student was in the process.
We turned to Cloudflare Durable Objects with state machines. Each registration becomes a durable object, a single-threaded, strongly consistent entity running at the edge. The state machine defines every possible state (EMAIL_ENTRY, EMAIL_VERIFICATION, TERMS_ACCEPTANCE, PAYMENT, etc.) and valid transitions between them. No matter what happens (browser closes, payment delays, manual interventions), the state machine knows exactly where the student is and what they can do next.
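The core of such a state machine is just a transition table plus a guard that rejects invalid events. A hedged sketch — the states come from the article, but the event names and table shape are illustrative, and in production this logic would live inside a Durable Object class:

```typescript
// Registration state machine sketch. Events not listed for a state are
// invalid and rejected, which is what keeps registrations from getting
// confused mid-flow.
type State = "EMAIL_ENTRY" | "EMAIL_VERIFICATION" | "TERMS_ACCEPTANCE" | "PAYMENT" | "COMPLETE";
type Event = "EMAIL_SUBMITTED" | "CODE_VERIFIED" | "TERMS_ACCEPTED" | "PAYMENT_CONFIRMED";

const transitions: Record<State, Partial<Record<Event, State>>> = {
  EMAIL_ENTRY: { EMAIL_SUBMITTED: "EMAIL_VERIFICATION" },
  EMAIL_VERIFICATION: { CODE_VERIFIED: "TERMS_ACCEPTANCE" },
  TERMS_ACCEPTANCE: { TERMS_ACCEPTED: "PAYMENT" },
  PAYMENT: { PAYMENT_CONFIRMED: "COMPLETE" },
  COMPLETE: {},
};

function applyEvent(state: State, event: Event): State {
  const next = transitions[state][event];
  // An invalid event fails loudly instead of corrupting the registration.
  if (!next) throw new Error(`event ${event} not valid in state ${state}`);
  return next;
}
```

Because the durable object is single-threaded, concurrent events serialize through `applyEvent` one at a time, so there's never a race over which state the student is in.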
We built an admin dashboard where we can see every registration in flight, send events to move them through states, or even manually intervene when needed. A student stuck on payment? We can see their complete history. Email verification code expired? We can resend it. The system never loses state, never gets confused about what step comes next, and handles concurrent events gracefully.
The same pattern worked beautifully for professor applications: multi-step forms, document uploads, and approval workflows, all managed by durable objects and state machines. It's one of those architectural decisions that feels over-engineered at first but saves you countless debugging sessions later.
Students in MISsimulation submit "Strategic Intent" documents before each turn. Basically, they explain their planned campaign actions and reasoning. Instructors need to validate these submissions to ensure students understand the simulation mechanics and are thinking strategically. But with 100+ students submitting every turn, manual grading became overwhelming.
We turned to AI, but this wasn't a plug-in-an-API-key job. The AI needed to understand complex simulation rules, know which data sets students had purchased, remember what actions they took in previous turns, and provide constructive educational feedback, not just "looks good" or "needs work."
We built a multi-provider orchestration system. Why multiple providers? Cost, performance, and reliability. Different models excel at different tasks. Some are better at nuanced feedback but cost more. Others are economical for basic validation. The system automatically selects the right provider based on complexity and budget.
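The selection policy can be sketched as "cheapest provider that satisfies the quality and budget constraints." The provider names, prices, and tiers below are invented; the real system weighs more signals than this:

```typescript
// Illustrative provider-selection policy for the orchestration layer.
interface Provider { name: string; costPer1kTokens: number; qualityTier: number; } // tier 1 = basic, 3 = nuanced

const providers: Provider[] = [
  { name: "economy-model", costPer1kTokens: 0.1, qualityTier: 1 },
  { name: "balanced-model", costPer1kTokens: 0.5, qualityTier: 2 },
  { name: "premium-model", costPer1kTokens: 2.0, qualityTier: 3 },
];

function selectProvider(requiredTier: number, maxCostPer1k: number): Provider {
  // Cheapest provider that meets both the quality floor and the budget cap.
  const eligible = providers
    .filter((p) => p.qualityTier >= requiredTier && p.costPer1kTokens <= maxCostPer1k)
    .sort((a, b) => a.costPer1kTokens - b.costPer1kTokens);
  if (eligible.length === 0) throw new Error("no provider fits the budget");
  return eligible[0];
}
```

A failed call can fall back to the next eligible provider, which is where the reliability benefit of multiple providers comes from.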
The real challenge was context management. We feed the AI the simulation rules, FAQ documents, the student's turn history, their team's current state, and the specific strategic submission (sometimes tens of thousands of tokens). We use retrieval-augmented generation (RAG) to pull in only relevant context, keeping costs manageable while maintaining accuracy.
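The retrieval step can be illustrated with a deliberately simple version: score each document chunk against the submission and pack the best ones into a token budget. A production RAG pipeline would use embeddings rather than the keyword overlap shown here:

```typescript
// Toy retrieval: rank chunks by keyword overlap with the query, then greedily
// fill a token budget. Embedding-based similarity would replace score() in a
// real pipeline.
interface Chunk { text: string; tokens: number; }

function score(chunk: Chunk, query: string): number {
  const words = new Set(query.toLowerCase().split(/\s+/));
  return chunk.text.toLowerCase().split(/\s+/).filter((w) => words.has(w)).length;
}

function selectContext(chunks: Chunk[], query: string, tokenBudget: number): Chunk[] {
  const ranked = [...chunks].sort((a, b) => score(b, query) - score(a, query));
  const selected: Chunk[] = [];
  let used = 0;
  for (const c of ranked) {
    if (used + c.tokens > tokenBudget) continue; // skip chunks that would overflow the budget
    selected.push(c);
    used += c.tokens;
  }
  return selected;
}
```

Only the selected chunks go into the prompt alongside the student's turn history, which is how prompt sizes (and costs) stay bounded.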
Now students get instant, intelligent feedback. "Your planned ad campaign in the North district makes sense given your polling data, but have you considered that your opponent is likely to target the same area based on their previous strategy?" The AI doesn't just validate, it teaches. And instructors can focus on higher-level engagement instead of reading hundreds of strategy documents.
We had a growing ecosystem: the main simulation platform, admin tools, Zoho accounting system integration, Zoho CRM integration for customer management, and various internal dashboards. Each one started with its own authentication. Students logging in here, instructors logging in there, admins logging in everywhere. It was a mess.
Building a unified authentication system across this ecosystem became essential. We needed single sign-on where a student could access the simulation, check their payment history, and contact support all with one login. Same for instructors managing courses and viewing analytics.
We centralized authentication in the Common Core platform, implementing JWT tokens shared across all applications. A student logs in once, gets a token, and that token works everywhere. The Zoho processor worker automatically syncs customer data, so when someone registers for the simulation, they're added to our CRM with the right tags and workflows.
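The essential property is that every application verifies tokens with the same shared secret (or key), so one login works everywhere. A minimal HMAC-signed token sketch of that idea follows; real deployments should use a vetted JWT library rather than hand-rolled signing, and this omits expiry and timing-safe comparison:

```typescript
import { createHmac } from "node:crypto";

// Minimal shared-auth sketch: sign once, verify anywhere with the same secret.
const b64url = (s: string) => Buffer.from(s).toString("base64url");

function sign(payload: object, secret: string): string {
  const header = b64url(JSON.stringify({ alg: "HS256", typ: "JWT" }));
  const body = b64url(JSON.stringify(payload));
  const sig = createHmac("sha256", secret).update(`${header}.${body}`).digest("base64url");
  return `${header}.${body}.${sig}`;
}

function verify(token: string, secret: string): object | null {
  const [header, body, sig] = token.split(".");
  const expected = createHmac("sha256", secret).update(`${header}.${body}`).digest("base64url");
  if (sig !== expected) return null; // signature mismatch: reject the token
  return JSON.parse(Buffer.from(body, "base64url").toString());
}
```

Each downstream app (simulation, admin tools, accounting) only needs `verify` and the shared secret; none of them touches the login flow itself.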
The accounting integration was particularly interesting. We needed to track not just payments, but correlate them with simulation usage, instructor accounts, and subscription renewals. The accounting system queries the same authentication service, so financial data is always tied to the correct user accounts. When an instructor's subscription expires, we can automatically restrict access across all systems.
From a developer perspective, it meant building shared packages in our monorepo with authentication middleware, user service clients, and shared interfaces. Now when we spin up a new internal tool, authentication is already handled. Import the package, configure the endpoints, done. It's one of those infrastructure investments that doesn't feel exciting but saves hours every week.
When you're building multiple interconnected applications, you face a choice: separate repositories with duplicated code, or a monorepo where everything lives together. We chose the monorepo, and it transformed how we work.
Our monorepo contains 15+ applications: Cloudflare Workers for various services, Angular and Svelte frontends, the simulation generator, admin tools, accounting systems, and shared packages used by everything. When we fix a bug in the authentication service or update an interface, every application gets the fix. When we add a new API endpoint to Common Core, the client automatically knows about it.
The shared packages are the real magic. We have packages defining types shared across frontend and backend. We have packages that handle templated emails from any service. We have packages that protect all our APIs. We have packages that provide the base classes for our state machine workflows.
When we build the registration form, it imports interfaces from shared packages and calls endpoints on Common Core with full type safety from the HTTP request body all the way through to the database query. The compiler tells us when we've broken something. We catch errors before deployment, not after users complain.
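The shared-package payoff is easiest to see in miniature: one interface imported by both the form and the Worker, so a renamed field breaks the build instead of production. The names below are illustrative, and the network hop is elided so the client calls the handler directly:

```typescript
// Sketch of end-to-end type safety via a shared interface. In the monorepo,
// these interfaces live in a shared package imported by client and server.
interface RegistrationRequest { email: string; courseCode: string; }
interface RegistrationResponse { ok: boolean; registrationId?: string; error?: string; }

// "Server" side — in reality a Common Core endpoint.
function handleRegistration(req: RegistrationRequest): RegistrationResponse {
  if (!req.email.includes("@")) return { ok: false, error: "invalid email" };
  return { ok: true, registrationId: `${req.courseCode}-${req.email}` };
}

// "Client" side — the compiler rejects a payload missing courseCode.
function submitForm(email: string, courseCode: string): RegistrationResponse {
  const payload: RegistrationRequest = { email, courseCode };
  return handleRegistration(payload);
}
```

Rename `courseCode` in the shared interface and every call site on both sides fails to compile, which is exactly the "catch errors before deployment" property described above.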
Is it more complex than separate repos? Initially, yes. Managing dependencies and build tooling across a monorepo takes work. But the productivity gains are enormous. We can refactor with confidence, share code without publishing packages, and maintain consistency across our entire platform. When you're a small team building a lot of interconnected pieces, the monorepo is a superpower.
Let's discuss how NorthernByte can help transform your ideas into reality.
Get Started