Supported Simulation Technology
Unity is a cross-platform game engine developed by Unity Technologies that is primarily used to develop video games and simulations for computers, consoles and mobile devices.
ForgeFX relies heavily on the Unity game engine to produce simulation-based training products for our clients. Our work has been featured online in Unity’s Made with Unity Showcase, we have exhibited at the Unity AR/VR Vision Summit, and products we’ve developed for our clients have appeared in the Unity3D Showcase.
The Unreal Engine is a game engine developed by Epic Games, first showcased in the 1998 first-person shooter game Unreal. Although primarily developed for first-person shooters, it has been successfully used in a variety of other genres, including stealth, MMORPGs, and other RPGs. With its code written in C++, the Unreal Engine features a high degree of portability and is a tool used by many game and simulation developers today.
By leveraging game engines to develop and deploy our training simulators, ForgeFX is able to support dozens of hardware and operating system platforms. Embracing the ‘author once, deploy anywhere’ strategy, ForgeFX is able to guarantee that our clients’ software will run on any number of current and popular computer platforms, and will continue to run on future platforms as well.
Windows is Microsoft’s flagship operating system, and for many years was the de facto standard for all home and business computers. Microsoft Windows is the most popular end-user operating system, and all of our training simulators run on Microsoft Windows by default, as that’s the platform we do the majority of our development on.
Technically, Windows is a metafamily of graphical operating systems. It consists of several families of operating systems, each of which caters to a certain sector of the computing industry, with the operating system typically associated with the IBM PC compatible architecture.
iOS is a mobile operating system developed by Apple to run exclusively on its own hardware, like iPhones and iPads. The iOS user interface and input system is based upon direct manipulation, using multi-touch gestures.
Second only to Android, iOS is the most popular mobile operating system in the world. If you’d like to use an iOS-based training simulator developed by ForgeFX, download the JLG Equipment Simulator today.
Android is a mobile operating system developed by Google, designed for touchscreen mobile devices such as smartphones and tablets.
Android’s user interface is mainly based on direct manipulation, using touch gestures that loosely correspond to real-world actions, such as swiping, tapping and pinching, to manipulate on-screen objects, along with a virtual keyboard for text input. In addition to touchscreen devices, Google developed Android TV for televisions, Android Auto for cars, and Android Wear for wrist watches, each with a specialized user interface.
The HTC Vive is one of the most popular virtual reality headsets on the market today: a head-mounted device that provides virtual reality for the wearer. VR headsets are widely used with computer games, but they are also used in other applications, including simulators and trainers. Developed by HTC and the Valve Corporation, the headset uses a technology called “room scale” tracking that allows the user to move around in 3D space, much as they do in the real world, and use motion-tracked handheld controllers to interact with the environment and objects within it.
The Vive contains a gyroscope, an accelerometer and a laser position sensor, which work together to track the position and orientation of your head. The HTC Vive and the Oculus Rift are both excellent hardware choices when it comes to simulation technology.
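The headset’s SDK performs this sensor fusion internally, but the underlying idea of blending a fast-but-drifting gyroscope with a slower absolute reference (such as the laser position sensor) can be sketched as a simple complementary filter. This is an illustrative one-axis model, not the Vive’s actual tracking code; the blend factor and time step are arbitrary.

```python
def complementary_filter(angle, gyro_rate, reference_angle, dt, alpha=0.98):
    """Blend gyroscope integration (fast, but drifts over time) with an
    absolute reference measurement (slow, but drift-free).

    alpha close to 1.0 trusts the gyro for responsiveness; the small
    (1 - alpha) share of the reference pulls the estimate back on track.
    """
    predicted = angle + gyro_rate * dt          # dead-reckon from the gyro
    return alpha * predicted + (1 - alpha) * reference_angle

# With the gyro reporting no rotation but the reference at 10 degrees,
# the estimate converges toward the reference over repeated updates.
angle = 0.0
for _ in range(100):
    angle = complementary_filter(angle, gyro_rate=0.0,
                                 reference_angle=10.0, dt=0.01)
```

Real trackers fuse all three sensors across three axes (and the Vive’s Lighthouse system adds timing-based laser sweeps), but the correction principle is the same.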
The Samsung Gear VR is a mobile virtual reality headset developed by Samsung Electronics, in collaboration with Oculus, and manufactured by Samsung. The Gear VR relies on a mobile phone as its CPU and GPU, rather than a PC, like the Rift and Vive. The Gear VR acts as the controller, but also integrates with third-party external controller devices. The Gear VR relies on an internal inertial measurement unit (IMU) for rotational tracking, but does not support positional tracking.
The Microsoft HoloLens is a self-contained, wearable holographic computer that layers digital content on top of reality, providing augmented reality. A voice-controlled, head-worn PC, the headset allows users to easily interact with, see and hear holograms that are displayed via high-definition lenses and spatial sound technology, contextually in the real world. Providing mixed-, or augmented-, reality-based simulations allows users to have shared virtual experiences together in the real world, similar to the way they have real experiences, allowing us to create and simulate any environment or object for users to collectively experience and interact with.
Apple’s ARKit is a framework that allows developers to create augmented reality experiences for iPhones and iPads. By blending digital objects and information with the environment around you, ARKit takes iOS-based apps beyond the screen, enabling them to interact with, and be aware of, the real world. With this connection to the real world on multiple levels, the devices become capable of overlaying digital computer-generated visualizations of data on top of real-world imagery, in real time.
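Overlaying digital visualizations on real-world imagery ultimately comes down to camera geometry: a tracked 3D point must be projected into pixel coordinates every frame. The sketch below is not the ARKit API (which is Swift/Objective-C); it is a minimal, illustrative pinhole-camera projection, with the focal length and principal point values chosen arbitrarily.

```python
def project_to_screen(point_cam, focal_px, cx, cy):
    """Pinhole-camera projection: map a 3D point in camera space
    (metres, z pointing forward) to 2D pixel coordinates.

    This is the geometry underlying any AR overlay: once the device
    knows where a virtual object sits relative to the camera, this
    projection says which pixel to draw it at.
    """
    x, y, z = point_cam
    u = cx + focal_px * x / z   # horizontal pixel offset from centre
    v = cy - focal_px * y / z   # screen y grows downward, hence the minus
    return (u, v)

# A point 10 cm to the right of the camera axis, 1 m ahead,
# with an assumed 1000 px focal length and a 1280x720 image centre.
print(project_to_screen((0.1, 0.0, 1.0), focal_px=1000, cx=640, cy=360))
# (740.0, 360.0)
```

AR frameworks hide this behind their rendering pipeline, but it is why accurate camera calibration and pose tracking matter so much to how “anchored” virtual content feels.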
ARCore is a platform for building augmented reality apps on the Android platform. ARCore uses three key technologies to integrate virtual content with the real world as seen through your phone’s camera: motion tracking, which allows the phone to understand and track its position relative to the world; environmental understanding, which allows the phone to detect the size and location of flat horizontal surfaces like the ground or a coffee table; and light estimation, which allows the phone to estimate the environment’s current lighting conditions.
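ARCore’s environmental understanding is far more sophisticated than this, but the core idea behind detecting a flat horizontal surface in a cloud of tracked feature points can be illustrated with a toy flatness check. This is not the ARCore API; the y-up coordinate convention, units (metres) and tolerance are all assumptions made for the sketch.

```python
def detect_horizontal_plane(points, tolerance=0.02):
    """Given (x, y, z) feature points with y pointing up, decide whether
    they lie on one flat horizontal surface and, if so, return its height.

    Returns None when the points are too spread out vertically to be
    a single plane -- a crude stand-in for real plane detection.
    """
    heights = [p[1] for p in points]
    mean_h = sum(heights) / len(heights)
    spread = max(abs(h - mean_h) for h in heights)
    return mean_h if spread <= tolerance else None

# Three points scattered across a table top, all near 0.70 m high.
table = [(0.1, 0.70, 0.2), (0.5, 0.71, 0.3), (0.3, 0.69, 0.8)]
print(detect_horizontal_plane(table))  # roughly 0.70
```

A production system clusters points, fits plane equations robustly, and grows the detected surface over time as the camera moves; the sketch only captures the “is this region flat, and at what height?” question.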
Eye & Hand Tracking
The Leap Motion controller is a small device that uses two IR cameras and three infrared LEDs to observe a hemispherical area in front of the user. The Leap Motion software synthesizes the 3D position data of the user’s hands so that they can be rendered in real time in the virtual world, and the motions and actions of the user’s real hands can be calculated, tracked and used as user input. The Leap Motion controller literally lets you “reach out and swipe, grab, pinch, or punch your way through the digital world”.
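As an illustration of how tracked fingertip positions become user input, the sketch below detects a “pinch” gesture purely from the distance between thumb and index fingertips. This is not the Leap Motion SDK; positions are assumed to be in metres and the threshold is arbitrary.

```python
import math

def is_pinching(thumb_tip, index_tip, threshold=0.03):
    """Detect a pinch gesture from two tracked fingertip positions.

    thumb_tip / index_tip: (x, y, z) coordinates in metres, the kind of
    per-finger data a hand-tracking device reports every frame.
    A pinch is simply the two tips coming within `threshold` metres.
    """
    return math.dist(thumb_tip, index_tip) < threshold

# Fingertips 1 cm apart -> pinching; 10 cm apart -> not pinching.
print(is_pinching((0.0, 0.0, 0.0), (0.01, 0.0, 0.0)))  # True
print(is_pinching((0.0, 0.0, 0.0), (0.10, 0.0, 0.0)))  # False
```

Real gesture recognizers add hysteresis and debouncing so the gesture does not flicker on and off at the threshold boundary, but distance tests like this are the building block.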
A device or computer equipped with an eye tracker “knows” what a user is looking at. This makes it possible for users to interact with computers using nothing but their eyes.
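Turning a gaze point into interaction is conceptually just a hit test against on-screen elements. A minimal sketch follows; the element names and pixel coordinates are invented for illustration, and a real eye tracker would also smooth the gaze signal and require a dwell time before triggering anything.

```python
def element_under_gaze(gaze_xy, elements):
    """Return the name of the on-screen element the user is looking at.

    gaze_xy:  (x, y) gaze point in screen pixels, as an eye tracker reports.
    elements: dict mapping element names to (x, y, width, height) rectangles.
    Returns None when the gaze falls on no element.
    """
    gx, gy = gaze_xy
    for name, (x, y, w, h) in elements.items():
        if x <= gx <= x + w and y <= gy <= y + h:
            return name
    return None

# Hypothetical UI layout with two buttons.
ui = {"start_button": (100, 100, 200, 80),
      "exit_button": (100, 300, 200, 80)}
print(element_under_gaze((150, 130), ui))  # start_button
```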
IBM Watson is a question answering computer system capable of answering questions posed in natural language, developed in IBM’s DeepQA project. Watson has access to 200 million pages of structured and unstructured content consuming four terabytes of disk storage, including the full text of Wikipedia, which it processes against six million logic rules. Named after IBM founder Thomas J. Watson, Watson is an IBM supercomputer that combines artificial intelligence and sophisticated analytical software for optimal performance as a question answering machine. Applications for Watson’s underlying cognitive computing technology are practically endless. Since the system can perform text mining and complex analytics on huge volumes of unstructured data, it can support a search engine or an expert/subject-matter-expert system with capabilities far superior to any current system.
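Watson’s DeepQA pipeline is vastly more complex than this, but the basic shape of mining unstructured text to answer a query can be illustrated with a toy keyword-scoring search. The example documents are invented, and nothing here resembles Watson’s actual technology beyond that shape: score unstructured text against the question, return the best source.

```python
def best_source(query, documents):
    """Rank free-text documents by how many query terms each contains,
    and return the highest-scoring one -- a toy keyword search.

    Real QA systems parse the question, generate candidate answers,
    and score evidence; this only does the crudest retrieval step.
    """
    terms = query.lower().split()

    def score(doc):
        words = doc.lower().split()
        return sum(words.count(t) for t in terms)

    return max(documents, key=score)

docs = ["Unity is a cross-platform game engine.",
        "Watson answers questions posed in natural language."]
print(best_source("natural language questions", docs))
```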
Microsoft Azure Machine Learning is a tool that allows developers to build, test, and deploy predictive analytics solutions on their data. Machine learning is a data science technique that allows computers to use existing data to forecast future behaviors, outcomes, and trends. Using machine learning, computers learn without being explicitly programmed. Forecasts or predictions from machine learning can make apps and devices smarter. Azure provides software as a service (SaaS), platform as a service (PaaS) and infrastructure as a service (IaaS), and supports many different programming languages, tools and frameworks, including both Microsoft-specific and third-party software and systems.
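At its smallest, the “use existing data to forecast future outcomes” idea behind machine learning can be shown with an ordinary least-squares line fit. This is a hand-rolled illustration, not the Azure Machine Learning API, and the trainee-score history is invented.

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a*x + b.

    The simplest possible 'learn from existing data, then forecast'
    model: slope and intercept are chosen to minimize squared error.
    """
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    a = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    b = mean_y - a * mean_x
    return a, b

# Hypothetical history: a trainee's score over five simulator sessions.
sessions = [1, 2, 3, 4, 5]
scores = [52, 61, 68, 77, 85]
a, b = fit_line(sessions, scores)
print(round(a * 6 + b))  # forecast the score for session 6
```

Cloud ML services automate far richer models than a straight line, but the workflow is the same: fit parameters to historical data, then apply them to unseen inputs.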
When producing training simulators, you want to be able to reach the widest possible audience in order to train as many people as possible. One of the best ways to ensure this wide distribution is to deploy your application on as many devices and platforms as you can. ForgeFX produces simulators for just about every device, from desktops and laptops, to mobile devices, to wearable AR and VR devices. We author content once and deploy applications on the widest possible array of devices.
Developing simulators that run on desktop computers is our default deployment and end-user target platform. We do all of our application development on these devices, so by default all of our simulators run on desktop machines, specifically Windows, Mac and Linux machines.
Desktop computers offer the most horsepower and accessibility for integrating real-world original equipment manufacturer (OEM) controls, as well as additional peripherals like VR/AR devices. Gone are the days of training simulators requiring racks of computers to function. Today’s off-the-shelf desktop computers, built for video game use, are more than capable of running our high-quality graphics simulation-based trainers, delivering highly immersive and engaging virtual experiences.
Second only to desktop computer-based simulators, laptop-based simulations are the most popular way of deploying training simulation software to users. The more portable a training simulation’s hardware platform is, the more likely it is to reach a greater number of trainees. Today’s laptop computers are more powerful than ever, with high-performance graphics cards and processors built for video gamers but also well-suited for interactive training simulators. Laptop-based training simulators can be connected to real-world equipment controls and additional peripherals, just like desktop computer-based simulators, but also have the option of being taken into the field or classroom to provide just-in-time training.
When it comes to deploying portable training simulators, tablet-based simulators lead the pack. Tablet computers provide inexpensive, lightweight, and highly customizable simulation solutions. Since there are no physical buttons to integrate with the application, all functionality can, and must, be simulated virtually in software. This allows tablet-based simulators to easily simulate any piece of equipment, and to switch seamlessly from one to another, making training easily accessible. Tablet-based training simulators can provide beginner- and intermediate-level scenarios to help improve operational skills, as well as controls familiarization, where students may practice skills as directed or of their own choosing.
There may be no technology that has changed our society more than the mobile phone. Today’s mobile phones are nothing short of pocket-sized, internet-connected supercomputers. While the screens may be small and the processing power limited, with more than a billion of these devices in the world capable of downloading and running highly effective simulation-based training content, the smartphone is an excellent platform to deploy training simulators on. Manufacturers like Apple and Google put a tremendous effort towards getting their phones into the hands of millions of people every year. By deploying your training simulation software on these popular devices, you are guaranteeing that your content will be capable of reaching the widest possible audience. Applications deployed on mobile devices are perfect for micro-learning, where users can download a specific procedure or scenario in real time as they need it.
Virtual Reality (VR) and Augmented Reality (AR) devices are the latest simulation technology advances to grace the training simulation world. The past few years have seen the release of consumer off-the-shelf (COTS) VR and AR devices that allow users to become fully immersed in content in a way traditional screen-based content never could. VR-based training simulators place the user in the middle of the virtual world, where they are free to look around, move around, and interact with the virtual world much as they interact with the real world. AR devices enable users to augment their view of the real world with interactive digital content, and to share this view with others who are in the same room with them, or on the other side of the world.
Developing Your Project
Regardless of what technology you’re looking to support, we can develop a custom training simulation application for you, to run on any device or platform. We encourage you to contact ForgeFX to discuss how we can create the perfect simulation-based training solution for your organization. We’ll talk with you about your specific requirements, and based on these we’ll work with you to arrive at a detailed proposal to develop and deliver your project. If you need a ballpark price, we can often provide that during initial phone conversations. If you require a more precise estimate and have a detailed project specification in hand, we can typically turn around a firm bid rapidly.