
Star Citizen BIG AI & NPC Updates – What Are CIG Working On Now?

Star Citizen & Squadron 42 have shared a load of updates on the AI & NPC tech being worked on… and a huge amount of that work is starting to pay dividends. I want to summarise that AI & NPC info from the monthly reports here in a dedicated video, going over what CIG have been focused on over the last few weeks and what their focus is now… let’s jump in:

The AI Content team progressed with the vendor and patron behaviours, realising the full flow of NPCs ordering, carrying, and consuming food and drink. This gives them a sense of life and purpose, as they’re always busy doing something.

Progress continued on the utility behaviour, with AI characters opening crates, taking items out, and putting items in. This was a surprisingly complex task, but the various elements are now working well together.

Other tasks involved polishing the quartermaster behaviour, making animation pose updates, setting up wildlines, overhauling documentation, and prototyping food eating.

They also began working on chapter 15, which is a run-down location with a different feel from other areas.

The AI animators continued to refine the Vanduul execution animations to ensure they’re perceived as strong and impactful from the first-person perspective, are better suited to being played “on the spot” if the attack occurs in a cramped space, can believably be deflected by the player, and allow players to escape or attack once deflected.

On the perception system side, the AI team implemented functionality to allow them to rate and respond to ‘disruptions,’ which are unexpected scenarios that can be resolved. For example, the lights of a room turning off or an engine being disabled. Players can use this mechanic to try to disrupt NPC behaviours when they haven’t been detected, giving them the opportunity to better infiltrate the level by avoiding patrols and sentries.

Players can also use ‘distractions’ (actions that trigger audio stimuli, such as throwing an object) to cause NPCs to leave their positions to investigate.

“We had to ensure that these events are propagated out to all of the agents within the appropriate area, that they are handled by the perception system, rated against the current threat level, and that the correct behaviors are implemented to resolve the disruptions. The resolution may involve agents going to investigate the source of the disruption, or alternatively, if too many disruptions have occurred, to increase their alertness level. This means that they may take out their weapons, say specific alertness-level wildlines, or respond quicker to subsequent events. We also had to take into consideration that the immediate source of the disruption may not be what the NPC wants to investigate. For example, the agent may perceive that a light goes out, but they would want to investigate the light switch that controls the light, and ultimately, if they don’t encounter the player, turn the light back on again.” AI Features Team
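To make that flow a bit more concrete, here’s a minimal sketch of how disruption events might be propagated and rated. To be clear, all the names here (Disruption, Agent, AlertLevel, the three-disruption threshold) are my own illustrative guesses, not CIG’s actual code:

```python
import math
from dataclasses import dataclass
from enum import IntEnum

class AlertLevel(IntEnum):
    RELAXED = 0
    SUSPICIOUS = 1
    ALARMED = 2

@dataclass
class Disruption:
    kind: str          # e.g. "light_off", "engine_disabled"
    source_pos: tuple  # where the stimulus happened (the dark room)
    resolve_pos: tuple # where it gets resolved (the light switch)

@dataclass
class Agent:
    pos: tuple
    alert: AlertLevel = AlertLevel.RELAXED
    disruptions_seen: int = 0

    def on_disruption(self, d: Disruption) -> str:
        self.disruptions_seen += 1
        # Too many disruptions escalate alertness instead of triggering
        # yet another investigation: weapon out, alert wildlines, etc.
        if self.disruptions_seen >= 3:
            self.alert = AlertLevel.ALARMED
            return "draw weapon, play alertness-level wildline"
        self.alert = max(self.alert, AlertLevel.SUSPICIOUS)
        # The agent investigates the resolution point (the switch),
        # not necessarily the stimulus itself (the dark room).
        return f"investigate {d.resolve_pos}, then resolve '{d.kind}'"

def propagate(d: Disruption, agents: list, radius: float) -> None:
    """Hand the event to every agent within the appropriate area."""
    for agent in agents:
        if math.dist(agent.pos, d.source_pos) <= radius:
            agent.on_disruption(d)
```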

October also saw AI implement rocket launcher and railgun handling. For rocket launchers, NPCs have to consider if the shot will cause explosive splash damage to friendly agents. This check can be disabled using the friendly-fire trait to enable more reckless enemies. For the railgun, they changed how the behaviour works in cover. Now, NPCs will start charging the weapon while in cover before emerging to fire. For both of these weapons, the team disabled the standard check to see if the weapon can hit the target, so enemies will continue to fire at the target even when in cover. This allows the player to see that these weapons are being fired nearby and respond accordingly rather than being immediately hit upon leaving cover. As part of this work, they also reviewed the existing shouldered weapon animations.
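Here’s a rough sketch of how those two checks could look. The friendly-fire trait and the charge-in-cover behaviour are from the report, but the splash radius, charge values, and function names are assumptions for illustration:

```python
import math

SPLASH_RADIUS = 6.0  # metres; an assumed value for the example

def can_fire_rocket(shooter, target_pos, friendlies) -> bool:
    # Reckless enemies skip the splash-damage safety check entirely.
    if shooter.traits.get("friendly_fire"):
        return True
    # Otherwise refuse shots whose splash would catch a friendly agent.
    return all(math.dist(f.pos, target_pos) > SPLASH_RADIUS
               for f in friendlies)

def railgun_tick(npc) -> str:
    # Charge behind cover first, only emerging once the weapon is ready.
    if npc.in_cover and npc.charge < 1.0:
        npc.charge += npc.charge_rate
        return "stay in cover, keep charging"
    # The line-of-sight check is deliberately skipped, so the NPC fires
    # even at a target in cover, telegraphing the shot to the player.
    return "emerge and fire"
```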

Continuing with weapon usage, AI Features implemented how NPCs pick up weapons and ammunition during combat. Enemies are under the same ammunition constraints as the player, requiring them to conserve it and use other weapons when they run out. NPCs will evaluate the best course of action in this situation, including switching to their secondary weapon, picking up ammunition, or finding a new weapon altogether. When looking for ammunition, they could take it from boxes, out of loose weapons, or from stowed weapons on dead or incapacitated characters. When looking for weapons, they could get them from racks, find them around the area, or take them from other characters. To implement this, weapons were set up as usables, which allows the AI to search for the best usable that can provide ammo or a weapon. The team had to modify the usable system slightly, including adding support for usables that supply themselves (loose weapons or ammo) and ensuring that usable alignment positions are correctly aligned with gravity when the weapon or ammo falls.
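As a hedged sketch of that search, assuming a simple nearest-compatible-usable heuristic (the helper names here are invented, not CIG’s API):

```python
import math

def best_resupply_usable(npc, usables):
    """Pick the nearest usable that can resupply the NPC.

    Candidates include ammo boxes, loose weapons or ammo (usables that
    'supply themselves'), and stowed weapons on downed characters.
    """
    def compatible(u):
        return u.provides_ammo_for(npc.weapon) or u.provides_weapon()

    candidates = [u for u in usables if compatible(u)]
    if not candidates:
        return None  # fall back to the secondary weapon instead
    return min(candidates, key=lambda u: math.dist(npc.pos, u.pos))
```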

For non-combat AI, the team worked on functionality to support NPCs moving cargo from one location to another, including picking up crates, stacking them on a trolley, moving the trolley to a destination (such as a cargo hold), and then taking them off the trolley and re-stacking them. An existing prototype was also adapted with new tech to provide behaviours that can be used in varying locations to bring them to life.
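The report doesn’t describe the implementation, but the flow it lists maps naturally onto a small state machine. A minimal sketch with invented state names:

```python
from enum import Enum, auto

class CargoState(Enum):
    PICK_UP_CRATE = auto()
    STACK_ON_TROLLEY = auto()
    MOVE_TROLLEY = auto()
    UNSTACK = auto()
    DONE = auto()

def next_state(state: CargoState, trolley_full: bool,
               at_destination: bool, crates_left: int) -> CargoState:
    # Loop: load crates onto the trolley, haul it to the destination
    # (e.g. a cargo hold), unload, and repeat until nothing is left.
    if state is CargoState.PICK_UP_CRATE:
        return CargoState.STACK_ON_TROLLEY
    if state is CargoState.STACK_ON_TROLLEY:
        return CargoState.MOVE_TROLLEY if trolley_full else CargoState.PICK_UP_CRATE
    if state is CargoState.MOVE_TROLLEY:
        return CargoState.UNSTACK if at_destination else CargoState.MOVE_TROLLEY
    if state is CargoState.UNSTACK:
        return CargoState.DONE if crates_left == 0 else CargoState.PICK_UP_CRATE
    return CargoState.DONE
```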

They also refined their tools to generate better-quality cover. This included making cover generation more robust to edge cases and supporting the recognition of cover beyond a certain distance from identified ‘cover edges.’ This allows cover to be generated in a greater range of areas.

Finally, AI Features took the first steps towards implementing ‘buddy AI’ that will support gameplay where the player is accompanied by one or more characters. To begin with, they made a rapid prototype featuring some of the functionality that might be required to generate conversations about what the final form of this would be and how it will fit into gameplay.

The AI Tech team made several usability improvements to the Apollo Subsumption tool. This included updating the multigraph with feature folders and functions to allow organisation from the outline view. For example, the designers are now able to delete and move graphs to different folders directly from the outline view panel. They can also drag a node connection into an empty section of the graph to open the drop-down menu that allows them to create a new task. 

Minor usability improvements were made too, including allowing all the entries in the outline view to collapse and the addition of a new filter field in the variable tree view.

They also extended the graph view with a new custom view style. This is currently used to allow Apollo to open, visualise, and edit the Mastergraph files that were previously managed by hand.

For perception, the team worked on supporting audio noise levels and audio masking. The audio noise level ensures that audio stimuli that don’t surpass the basic background noise won’t be heard by listeners. New tools were implemented to allow the designers to configure the audio noise level in an area and the types of sounds that should be considered “normal” there.

Audio masking ensures that particular audio stimuli are not perceived as anomalous, such as explosions in a field where fireworks are tested. This allows the player to use the environment as audio cover when performing actions, for example, firing weapons near a shooting range so as not to alert NPCs.
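Putting the two together, here’s a minimal sketch of how a listener might rate an audio stimulus, assuming an area carries a configured noise floor and a list of “normal” sound types (the field names are invented):

```python
def perceive_audio(stimulus, area):
    """Rate an audio stimulus against an area's noise configuration.

    'stimulus' and 'area' are illustrative stand-ins, not CIG's types.
    """
    # Audio noise level: sounds below the area's background noise floor
    # are simply never heard by listeners in that area.
    if stimulus.loudness <= area.noise_level:
        return None
    # Audio masking: sounds of a kind that is "normal" for the area
    # (gunfire at a shooting range) are heard but not flagged as odd.
    anomalous = stimulus.kind not in area.normal_sounds
    return {"heard": True, "anomalous": anomalous}
```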

The team implemented light extensions to action areas. These enable the designers to link light groups to an area and automatically calculate whether the state of the environment should be light or dark. This impacts the time NPCs require to perceive a target, making it easier to stay hidden in darker areas.
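A minimal sketch of that idea, assuming a simple lit-fraction threshold and a flat darkness multiplier (both values are invented for the example):

```python
def area_is_dark(linked_lights) -> bool:
    # An area counts as dark when most of its linked lights are off;
    # the 50% threshold is an assumption for illustration.
    lit = sum(1 for light in linked_lights if light.on)
    return lit < len(linked_lights) / 2

def time_to_perceive(base_time: float, dark: bool) -> float:
    # Darker areas multiply the time an NPC needs to confirm a target,
    # making it easier for the player to stay hidden there.
    return base_time * (2.0 if dark else 1.0)
```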

AI Tech also extended the system that handles the visibility of large objects, such as vehicles.

“Large vehicles can be seen at further distances but we need to avoid approaches that cost too much on the CPU, such as doing raycasts for very long distances. To prevent this, we can now describe the visibility of a large object depending on the degree of visual angle – basically at which distance it has a specific size from the observer’s perspective.” AI Tech Team

This also allows the team to defer the discovery phase (working out which observers are within range of a large object) to a single query, after which each individual agent verifies the occlusion of the object for itself.
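The angular-size test itself is cheap trigonometry. Here’s a sketch under the assumption that a large object counts as potentially visible once it subtends some minimum angle, with the 0.5° threshold invented for the example:

```python
import math

def visual_angle_deg(object_size: float, distance: float) -> float:
    # Angle subtended by an object of a given bounding size at a distance.
    return math.degrees(2.0 * math.atan2(object_size / 2.0, distance))

def large_object_in_range(observer_pos, obj_pos, obj_size,
                          min_angle_deg: float = 0.5) -> bool:
    # Cheap pre-filter: no raycast needed. Only observers that pass
    # this test go on to verify occlusion individually.
    d = math.dist(observer_pos, obj_pos)
    return visual_angle_deg(obj_size, d) >= min_angle_deg
```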

The parameters of the visual field-of-view were also extended to be overridable based on the AI state. For example, allowing pilots to see at a greater distance and visually target on-foot characters.

The team continued extending navigation links to support two main functionalities: Firstly, they enabled usable routing to function with entity links to allow door panels (that are linked to a door) to be correctly used by NPCs. This will allow them to understand which panel they should use to trigger the right interaction.

Secondly, they exposed a way to define the costs of nav-links based on specific conditions, such as increasing the cost of vaulting when a character isn’t alerted or is walking. These costs allow the system to describe how preferable actions are from the NPC’s perspective; for example, a character wouldn’t vault, which is a physically demanding action, if it wasn’t needed. Both extensions are sketched below.
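Here’s a rough sketch covering both of those extensions, with the link kinds, the side test, and the cost multipliers all invented for illustration:

```python
def panel_for_door(npc, door):
    """Pick the door panel the NPC should actually interact with.

    'door.linked_panels' stands in for the entity links described in
    the report; the side test is an invented simplification.
    """
    # Only panels on the NPC's side of the doorway are usable from here.
    reachable = [p for p in door.linked_panels
                 if p.side == door.side_of(npc.pos)]
    return reachable[0] if reachable else None

def navlink_cost(link, npc) -> float:
    base = link.length  # distance-based baseline cost
    if link.kind == "vault":
        # Vaulting is physically demanding, so it's heavily penalised
        # unless the NPC is alerted; the multipliers are assumptions.
        return base * (1.5 if npc.alerted else 10.0)
    return base
```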

Work began to properly allow NPCs to drive ground vehicles. The team are currently verifying the setup of a selection of vehicles to use as reference. They also set up the code to switch to the right nav-mesh when driving a vehicle and select the right path-follower.

Finally for AI Tech, the team worked on the locomotion refactor, which involved integrating the current work into the different streams to begin stabilising bugs. They also experimented with foot locking in MoveNet and are currently verifying how to use this functionality in various aspects of locomotion, such as during sharp turns. They also implemented a first-frame adjustment for the seamless transition code to improve entering usables, which allows a specific animation to be played at a specific location with a specific pose.

AI Vehicles worked with Flight Design to deliver a turret battle that occurs early in the campaign. This involved reworking wave 0, which will be used as the standard for the rest of the waves throughout the chapter. It includes a new behaviour that consists of Vanduul ships attacking the Javelin turret by picking stunt splines placed by the designers.

In addition, they completed various support tasks for the designers working in different flight-based chapters.