The most essential part of a successful digital twin implementation is a solid infrastructure: one that supports your models and data streams while retaining the ability to adapt as your stakeholders' needs change. While building such a solution, you'll face many critical junctures where your choices will drastically affect the timeliness and effectiveness of your delivery. Should you use a high-fidelity model or a low-fidelity one? Should your models be evaluated eagerly or lazily? Should evaluation happen at the edge, in the cloud, or both? In this talk, you'll learn how to make these decisions based on your project's needs.

From the collection of your lowest-level sensor readings to the automated decision engine that should be leveraging your digital twin models, you will face similar questions and choices. For your project to maintain traction, you must understand how each choice affects not only your twin's alignment with the real world but also how well it meets your infrastructure needs.

Digital twins could revolutionize the way companies make decisions. However, the scaling and infrastructure complexities at the core of the digital twin idea can bring your pursuit of a solution to a grinding halt. This talk addresses key concepts drawn from Computer Science and Agile Development that, once understood, can keep your team moving forward rather than stalling out.