Supported Simulation Technology

ForgeFX Training Simulations is a team of experts with the tools required to produce high-quality, enterprise-grade training simulations. We use off-the-shelf middleware development tools and technology to produce our simulation-based training products. By leveraging industry-standard software development tools that are popular within the game and simulation development communities, we are able to offer our clients strategic advantages in both cost of production and ease of simulation deployment.

Game Engines

ForgeFX uses off-the-shelf middleware video game engine development tools to produce our products. We do not use proprietary, home-grown software for which finding additional developers can be difficult. ForgeFX uses industry-standard development tools that are popular within game and simulation development communities, allowing us to produce simulations for virtually any device or operating system. In addition to the advantages this presents to ForgeFX (a large developer pool to draw from, a large support community, pre-built components that save time, etc.), there are significant advantages for clients: clients own all of the source code and assets that we create, and there are no license fees or per-seat costs associated with our simulators.
Unity
Unity is a cross-platform game engine developed by Unity Technologies that is primarily used to develop video games and simulations for computers, consoles, and mobile devices.
ForgeFX relies heavily on the Unity game engine to produce simulation-based training products for our clients. We have been featured online in Unity's Made with Unity Showcase, have exhibited at the Unity AR/VR Vision Summit, and have had products we've developed for our clients featured in the Unity3D Showcase.
Unreal Engine
The Unreal Engine is a game engine developed by Epic Games, first showcased in the 1998 first-person shooter Unreal. Although primarily developed for first-person shooters, it has been used successfully in a variety of other genres, including stealth games, MMORPGs, and other RPGs.
With its code written in C++, the Unreal Engine features a high degree of portability and is a tool used by many game and simulation developers today.

Platforms

By leveraging game engines to develop and deploy our training simulators, ForgeFX is able to support dozens of hardware and operating system platforms. Embracing an 'author once, deploy anywhere' strategy, ForgeFX is able to guarantee that our clients' software will run on any number of current and popular computer platforms, and to ensure that it will continue to run on future platforms as well.
Microsoft Windows
Windows is Microsoft's flagship operating system, and for many years was the de facto standard for home and business computers. Microsoft Windows remains the most popular end-user desktop operating system, and all of our training simulators run on Windows by default, as it is the platform we do the majority of our development on. Technically, Windows is a family of graphical operating systems, with several branches that each cater to a particular sector of the computing industry, and it is typically associated with IBM PC compatible architecture.
iOS
iOS is a mobile operating system developed by Apple to run exclusively on its own hardware, such as iPhones and iPads.
The iOS user interface and input system is based upon direct manipulation, using multi-touch gestures. After Android, iOS is the second most popular mobile operating system in the world. If you'd like to use an iOS-based training simulator developed by ForgeFX, download the JLG Equipment Simulator today.
Android
Android is a mobile operating system developed by Google, designed for touchscreen mobile devices such as smartphones and tablets.
Android's user interface is mainly based on direct manipulation, using touch gestures that loosely correspond to real-world actions, such as swiping, tapping and pinching, to manipulate on-screen objects, along with a virtual keyboard for text input. In addition to touchscreen devices, Google developed Android TV for televisions, Android Auto for cars, and Android Wear for wrist watches, each with a specialized user interface.
WebGL
WebGL (Web Graphics Library) is a JavaScript API for rendering 3D graphics within any compatible web browser without the use of plug-ins. WebGL is fully integrated with the browser's other web standards, allowing GPU-accelerated image processing and physics effects as part of the web page. WebGL programs consist of control code written in JavaScript and shader code written in the OpenGL Shading Language (GLSL), a C-like language that is executed on the computer's graphics processing unit (GPU).
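As a rough illustration of the JavaScript-plus-GLSL split described above, here is a minimal sketch (written in TypeScript, and not taken from any ForgeFX product) that compiles and links a tiny shader program for the GPU to execute:

```typescript
// Minimal WebGL setup: control code compiles GLSL shader code,
// links it into a program, and hands it to the GPU for execution.
const canvas = document.createElement('canvas');
const gl = canvas.getContext('webgl');
if (!gl) throw new Error('WebGL is not supported in this browser');

const vertexSource = `
  attribute vec2 position;
  void main() {
    gl_Position = vec4(position, 0.0, 1.0);
  }`;
const fragmentSource = `
  precision mediump float;
  void main() {
    gl_FragColor = vec4(1.0, 0.5, 0.0, 1.0); // solid orange
  }`;

function compile(type: number, source: string): WebGLShader {
  const shader = gl!.createShader(type)!;
  gl!.shaderSource(shader, source);
  gl!.compileShader(shader);
  if (!gl!.getShaderParameter(shader, gl!.COMPILE_STATUS)) {
    throw new Error(gl!.getShaderInfoLog(shader) ?? 'Shader failed to compile');
  }
  return shader;
}

// Link the vertex and fragment stages into a single GPU program.
const program = gl.createProgram()!;
gl.attachShader(program, compile(gl.VERTEX_SHADER, vertexSource));
gl.attachShader(program, compile(gl.FRAGMENT_SHADER, fragmentSource));
gl.linkProgram(program);
gl.useProgram(program);
```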

Virtual Reality

Consumer virtual reality (VR) is a boon for the training simulator industry. Advances in technology have led to the proliferation of affordable VR devices and computers capable of running them. VR represents a huge evolutionary step forward in computer graphics rendering and user input methods. In a nutshell, we are able to do things in VR that we simply cannot do with traditional screen-based simulators. In addition to the feeling of immersion and presence that VR gives the user, it also includes elements like stereoscopic 3D, which allows for a virtual sense of depth perception, and positional tracking of the user's body, allowing the application to know where they are in the virtual space.
VR-based training simulators are a game-changer and have produced a significant shift in the training world by allowing users to be fully engaged with the training content in a way never before possible.
HTC Vive
The HTC Vive is one of the most popular virtual reality headsets on the market today: a head-mounted device that provides virtual reality for the wearer. VR headsets are widely used with computer games, but they are also used in other applications, including simulators and trainers.
Developed by HTC and the Valve Corporation, the headset uses a technology called "room scale" tracking, which allows the user to move around in 3D space much as they do in the real world, and to use motion-tracked handheld controllers to interact with the environment and the objects within it. The Vive contains a gyroscope, an accelerometer, and a laser position sensor, which work together to track the position of your head. The HTC Vive, as well as the Oculus Rift, are excellent hardware choices when it comes to simulation technology.
Oculus Rift
The Oculus Rift is a VR headset developed and manufactured by Oculus VR, a division of Facebook. The Rift has a PenTile OLED display with a 1080×1200 resolution per eye, a 90 Hz refresh rate, and a 110° field of view. It has integrated headphones that provide a 3D audio effect, along with rotational and positional tracking. The positional tracking system, known as Constellation, is performed by a stationary USB infrared sensor that picks up light emitted by IR LEDs integrated into the head-mounted display. The sensor normally sits on the user's desk. This creates a tracked 3D space, allowing the user to use the Rift while sitting, standing, or walking around the same room.
Samsung Gear VR
The Samsung Gear VR is a mobile virtual reality headset developed by Samsung Electronics in collaboration with Oculus.
The Gear VR relies on a mobile phone for its CPU and GPU, rather than a PC as the Rift and Vive do. The headset itself acts as a controller, but it also integrates with third-party external controllers. For head tracking, the Gear VR relies on an internal inertial measurement unit (IMU), which provides rotational tracking but not positional tracking.
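To make the distinction concrete, the sketch below (an illustrative TypeScript example using three.js, not Gear VR SDK code, with the IMU callback wiring assumed) shows what rotation-only tracking means in practice: an IMU orientation sample drives the camera's rotation while its position stays fixed, because no positional data is available.

```typescript
// Rotation-only (3DoF) head tracking: orientation updates, position does not.
import * as THREE from 'three';

const camera = new THREE.PerspectiveCamera(90, 16 / 9, 0.1, 1000);
camera.position.set(0, 1.6, 0); // fixed eye height; leaning or stepping sideways has no effect

function onImuSample(q: { x: number; y: number; z: number; w: number }): void {
  // Copy the latest IMU orientation onto the camera, leaving camera.position untouched.
  camera.quaternion.set(q.x, q.y, q.z, q.w);
}
```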

Augmented Reality

Augmented reality (AR) consists of a view of a physical, real-world environment whose elements are augmented by computer-generated graphical data. It is related to a more general concept called computer-mediated reality, in which a view of reality is modified by a computer. Whereas virtual reality replaces the user's view of the real-world with a simulated one, augmented reality enhances one’s current perception of reality with computer-generated content. Augmentation techniques are typically performed in real-time and in context with environmental elements, such as overlaying supplemental information over a live view of the real-world.
Microsoft HoloLens
The Microsoft HoloLens is a self-contained, wearable holographic computer that layers digital content on top of reality, providing augmented reality. The voice-controlled headset allows users to easily see, hear, and interact with holograms that are displayed via high-definition lenses and spatial sound technology, in context with the real world. Mixed- or augmented-reality-based simulations allow users to have shared virtual experiences together in the real world, similar to the way they share real experiences, which lets us create and simulate any environment or object for users to collectively experience and interact with.
Apple ARKit
Apple's ARKit is a framework that allows developers to create augmented reality experiences for iPhones and iPads. By blending digital objects and information with the environment around you, ARKit takes iOS-based apps beyond the screen, enabling them to interact with, and be aware of, the real world. With this connection to the real world, devices become capable of overlaying digital, computer-generated visualizations of data on top of real-world imagery in real time.
Google ARCore
ARCore is Google's platform for building augmented reality apps on Android. ARCore uses three key technologies to integrate virtual content with the real world as seen through your phone's camera.
Motion tracking allows the phone to understand and track its position relative to the world. Environmental understanding allows the phone to detect the size and location of flat horizontal surfaces, like the ground or a coffee table. Light estimation allows the phone to estimate the environment's current lighting conditions.

Eye & Hand Tracking

Eye and hand tracking technologies allow users to interact with computers through hand, finger, and eye motions. Users are able to interact with computer-generated virtual elements just as they do with real-world physical objects. Rather than having to move a cursor on top of something to select it, eye tracking allows users to simply look at an object to select it. Similarly, hand tracking allows digital elements to be interacted with just as physical elements are, through manipulation by fingers and hands.
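As a hardware-agnostic sketch of how gaze-based selection can work (using three.js for the ray cast; the gaze origin and direction are assumed to come from an eye tracker), the first object hit by a ray along the user's gaze is treated as "looked at":

```typescript
// Gaze selection: cast a ray from the eye position along the gaze direction
// and return the first selectable object it intersects, if any.
import * as THREE from 'three';

const raycaster = new THREE.Raycaster();

function pickGazedObject(
  gazeOrigin: THREE.Vector3,
  gazeDirection: THREE.Vector3,
  selectables: THREE.Object3D[]
): THREE.Object3D | null {
  raycaster.set(gazeOrigin, gazeDirection.clone().normalize());
  const hits = raycaster.intersectObjects(selectables, true);
  return hits.length > 0 ? hits[0].object : null;
}
```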
Leap Motion
The Leap Motion controller is a small device that uses two IR cameras and three infrared LEDs to observe a hemispherical area in front of the user. The Leap Motion software synthesizes the 3D position data of the user's hands so that they can be rendered in real time in the virtual world, and the motions and actions of the user's real hands can be calculated, tracked, and used as user input. The Leap Motion controller literally lets you "reach out and swipe, grab, pinch, or punch your way through the digital world".
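For a feel of what that hand data looks like to an application, here is a small sketch using the LeapJS client library; the exact fields shown (hands, palmPosition, grabStrength) are from Leap Motion's v2 JavaScript API and should be treated as assumptions if you are targeting a newer SDK:

```typescript
// Read tracked hand data each frame and turn it into simple user-input signals.
import Leap from 'leapjs';

Leap.loop((frame: any) => {
  for (const hand of frame.hands) {
    const [x, y, z] = hand.palmPosition;      // millimeters, relative to the controller
    const grabbing = hand.grabStrength > 0.8; // 0 = open hand, 1 = closed fist
    // Feed the palm position and grab state into the simulation as user input.
    console.log(`${hand.type} hand at (${x.toFixed(0)}, ${y.toFixed(0)}, ${z.toFixed(0)}), grabbing: ${grabbing}`);
  }
});
```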
Tobii Eye Tracking
Tobii's eye tracking technology includes a sensor that enables a device to know exactly where your eyes are focused. It can determine your presence, attention, focus, drowsiness, consciousness, or other mental states, and it allows software to process and react to these states. Eye tracking is a technology that puts you in control of your device by using your eyes as you naturally would. A device or computer equipped with an eye tracker “knows” what a user is looking at, which makes it possible for users to interact with, for example, computers using only their eyes.

Artificial Intelligence

Artificial Intelligence refers to computer systems that are able to perform tasks that normally require human intelligence, such as visual perception, speech recognition, decision-making, and translation between languages. Artificial intelligence is the simulation of human intelligence processes by computers.
IBM Watson
IBM Watson is a question-answering computer system capable of answering questions posed in natural language, developed in IBM's DeepQA project. Watson has access to 200 million pages of structured and unstructured content, consuming four terabytes of disk storage and including the full text of Wikipedia, which it processes against six million logic rules.
Named after IBM's first CEO, Thomas J. Watson, Watson is an IBM supercomputer that combines artificial intelligence and sophisticated analytical software for optimal performance as a “question answering” machine. Applications for Watson's underlying cognitive computing technology are practically endless. Since the system can perform text mining and complex analytics on huge volumes of unstructured data, it can support a search engine or an expert system with capabilities far superior to any current system.
Microsoft Azure Machine Learning
Microsoft Azure Machine Learning is a tool that allows developers to build, test, and deploy predictive analytics solutions on their data. Machine learning is a data science technique that allows computers to use existing data to forecast future behaviors, outcomes, and trends. Using machine learning, computers learn without being explicitly programmed, and the resulting forecasts or predictions can make apps and devices smarter. Azure provides software as a service (SaaS), platform as a service (PaaS), and infrastructure as a service (IaaS), and supports many different programming languages, tools, and frameworks, including both Microsoft-specific and third-party software and systems.
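To illustrate the underlying "learn from existing data to forecast outcomes" idea, here is a toy sketch using TensorFlow.js purely as a stand-in; this is not the Azure Machine Learning SDK, just the supervised-learning pattern that such services automate at scale:

```typescript
// Fit a tiny linear model to historical observations, then forecast an unseen input.
import * as tf from '@tensorflow/tfjs';

async function forecast(): Promise<void> {
  // Historical data: inputs x and observed outcomes y (here, y ≈ 2x - 1).
  const xs = tf.tensor2d([1, 2, 3, 4, 5], [5, 1]);
  const ys = tf.tensor2d([1, 3, 5, 7, 9], [5, 1]);

  // A single-layer linear model trained by gradient descent.
  const model = tf.sequential();
  model.add(tf.layers.dense({ units: 1, inputShape: [1] }));
  model.compile({ loss: 'meanSquaredError', optimizer: 'sgd' });
  await model.fit(xs, ys, { epochs: 200, verbose: 0 });

  // Forecast the outcome for a new, unseen input.
  const prediction = model.predict(tf.tensor2d([10], [1, 1])) as tf.Tensor;
  prediction.print(); // ≈ 19
}

forecast();
```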
Google Cloud Platform
Google Cloud Machine Learning Engine is a managed service that enables developers to build machine learning models that work on any type of data. It is part of Google Cloud Platform, a suite of cloud computing services that runs on the same infrastructure Google uses internally for its end-user products, such as Google Search and YouTube.
Alongside a set of management tools, it provides modular cloud services including computing, data storage, data analytics, and machine learning. The machine learning service is integrated with Google Cloud Dataflow for pre-processing, allowing you to access data from Google Cloud Storage, BigQuery, and other sources. Cloud Platform's portfolio of cloud services isn't as extensive as Amazon's or Microsoft Azure's, but it offers some specialized tools for developers that are hard to find elsewhere.

Devices

When producing training simulators, you want to be able to reach the widest possible audience in order to train as many people as possible. One of the best ways to ensure this wide distribution is to deploy your application on as many devices and platforms as you can. ForgeFX produces simulators for just about every device, from desktops and laptops, to mobile devices, to wearable AR and VR devices. We author content once and deploy applications on the widest possible array of devices.
Desktop Computer-Based Simulators
Developing simulators that run on desktop computers is our default deployment and targeted end-user platform. We do all of our application development on these devices, so by default all of our simulators run on desktop machines, specifically Windows, Mac, and Linux machines.
Desktop computers offer the most horsepower and accessibility for integrating real-world original equipment manufacturer (OEM) controls, as well as additional peripherals like VR/AR devices. Gone are the days of training simulators requiring racks of computers to function. Today's off-the-shelf desktop computers, intended for video game use, are more than capable of running our high-quality, graphics-intensive simulation-based trainers, delivering highly immersive and engaging virtual experiences.
Laptop-Based Training Simulators
Second only to desktop computer-based simulators, laptop-based simulators are the most popular way of deploying training simulation software to users. The more portable a training simulation's hardware platform is, the more likely it is to reach a greater number of trainees. Today's laptop computers are more powerful than ever, with high-performance graphics cards and processors built for video gamers but also well suited for interactive training simulators.
Laptop-based training simulators are capable of being connected to real-world equipment controls and additional peripherals, just like desktop computer-based simulators, but also have the option of being taken into the field or classroom setting, to provide just-in-time training.
Tablet-Based Training Simulators
When it comes to deploying portable training simulators, tablet-based simulators lead the pack. Tablet computers provide inexpensive, lightweight, and highly customizable simulation solutions. Since there are no physical buttons to integrate with the application, all functionality can, and must, be simulated virtually by software. This allows tablet-based simulators to easily simulate any piece of equipment and to switch seamlessly from one piece of equipment to another, making for easily accessible training.
Tablet-based training simulators can provide beginner- and intermediate-level scenarios to help improve operational skills, as well as controls familiarization, where students may practice skills as directed or of their own choosing.
Mobile Phone-Based Training Simulators
There may be no technology that has changed our society more than the mobile phone. Today's mobile phones are nothing short of pocket-sized, internet-connected supercomputers. While the screens may be small and the processing power limited, with more than a billion of these devices in the world capable of downloading and running highly effective simulation-based training content, the smartphone is an excellent platform to deploy training simulators on.
Manufacturers like Apple and Google put tremendous effort toward getting their phones into the hands of millions of people every year. By deploying your training simulation software on these popular devices, you are guaranteeing that your content will be capable of reaching the widest possible audience. Applications deployed on mobile devices are perfect for micro-learning, where users can download a specific procedure or scenario in real time as they need it.
VR/AR Simulation-Based Training Simulators
Virtual reality (VR) and augmented reality (AR) devices are the latest simulation technology advances to grace the training simulation world. The past few years have seen the release of consumer off-the-shelf (COTS) VR and AR devices that allow users to become fully immersed in content in a way traditional screen-based content never could. VR-based training simulators place the user in the middle of the virtual world, where they are free to look around, move around, and interact with the virtual world much as they interact with the real world. AR devices enable users to augment their view of the real world with interactive digital content, and to share this view with others who are in the same room with them or on the other side of the world.

Developing Your Project

Regardless of what technology you're looking to support, we can develop a custom training simulation application for you, to run on any device or platform. We encourage you to contact ForgeFX to discuss how we can create the perfect simulation-based training solution for your organization. We'll talk with you about your specific requirements, and based on these we'll work with you to arrive at a detailed proposal to develop and deliver your project. If you need a ballpark price, we can often provide that during initial phone conversations. If you require a more precise estimate and have a detailed project specification in hand, we can typically turn around a firm bid rapidly.
Contact Us Now