Phil Fuster/Photo: Business Wire
In today’s ever-changing federal landscape, technologies like artificial intelligence and 5G are already unlocking new opportunities and capabilities across a wide range of federal missions. In this Executive Spotlight interview, Hitachi Vantara Federal Chief Growth Officer Phil Fuster explains which technologies, trends and changes are dominating the public sector today and why.
Read below for Phil Fuster’s Executive Spotlight interview.
Which emerging technologies are expected to have the greatest impact on the federal landscape in the coming years?
The areas that I see having the most impact on the federal landscape are AI, DataOps and ubiquitous connectivity through technologies like 5G and StarLink. Artificial intelligence captures how the best operators and analysts work with data, and it can process that data at line speed in a scalable manner. This is especially beneficial for agencies facing turnover, retirements and talent pipeline challenges.
Additionally, DataOps serves as a scientific framework for connecting vast datasets from diverse sources, ensuring that they are seamlessly meshed, unbiased and easily accessible to decision makers. Today’s landscape, in which data is scattered across geographically dispersed and often incompatible systems in both IT and OT/IoT, makes it challenging to integrate and consume information seamlessly. Adding to this challenge is the presence of data in a variety of formats, with obstacles arising from both file structures and data types. A critical solution lies in the ability to normalize data: converting it into a standardized format and common language to facilitate its use. This normalization process is essential to overcome the complexities of disparate data and ensure a more consistent, accessible dataset for informed decision making.
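The normalization idea described above can be illustrated with a minimal sketch. This is a hypothetical example, not part of any Hitachi product: records arrive in different formats (JSON and CSV here) with different field names, and an alias table maps everything onto one common schema. All field names and the target schema are invented for illustration.

```python
import csv
import io
import json

# Hypothetical alias table: many source-specific field names map onto
# one canonical schema (source, timestamp, value).
FIELD_ALIASES = {
    "sensor": "source",
    "src": "source",
    "ts": "timestamp",
    "time": "timestamp",
    "val": "value",
    "reading": "value",
}

def normalize_record(raw: dict) -> dict:
    """Rename aliased fields to the common schema and coerce types."""
    record = {}
    for key, value in raw.items():
        canonical = FIELD_ALIASES.get(key.lower(), key.lower())
        record[canonical] = value
    record["value"] = float(record["value"])  # one numeric type for all sources
    return record

def ingest(payload: str, fmt: str) -> list[dict]:
    """Parse a JSON or CSV payload, then normalize every record."""
    if fmt == "json":
        rows = json.loads(payload)
    elif fmt == "csv":
        rows = list(csv.DictReader(io.StringIO(payload)))
    else:
        raise ValueError(f"unsupported format: {fmt}")
    return [normalize_record(row) for row in rows]

# Two sources, two formats, two naming conventions -> one dataset.
json_data = '[{"sensor": "cam-1", "ts": "2024-01-01T00:00:00Z", "val": "3.5"}]'
csv_data = "src,time,reading\nradar-7,2024-01-01T00:01:00Z,9.25\n"

merged = ingest(json_data, "json") + ingest(csv_data, "csv")
for rec in merged:
    print(rec["source"], rec["timestamp"], rec["value"])
```

Once every record shares the same shape, downstream consumers no longer need to know which system the data came from, which is the point of the normalization step described above.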
Finally, ubiquitous access to these innovative tools is paramount for government agencies. Whether responding to a disaster or engaging in active military operations, the advent of 5G and StarLink will enable organizations to deploy traditional data center capabilities to the edge. For example, having the most knowledgeable analysts on-site, processing raw data in real time, whether at the U.S. border or on an airlift platform, can speed up the decision-making process. The integration of these technologies promises to make the right information available in the right place at the right time, significantly impacting decision making.
Let’s talk more about AI. Which applications are seeing the highest demand for AI/ML from federal customers? Can you explain the factors driving that demand?
Two key areas where artificial intelligence is in particularly high demand among federal government customers are intelligence collection and sensor data analysis. In the field of law enforcement, there is a critical need to quickly and efficiently collect and process disparate data from multiple devices in a variety of formats. Integrating AI and DataOps technologies helps accelerate data ingestion, reducing timelines from months to days. This combination allows raw data to be assembled into actionable intelligence that resembles the expertise of experienced agents, making the entire process scalable.
Another important area of demand is sensor data analysis, especially within the U.S. military. An extensive portfolio of sensor collection equipment, including airborne sensors, ground stations, HUMINT (human intelligence) and cyber assets, creates a constant influx of data that impacts warfighters. Data normalization and AI processing capabilities will be critical to getting immediate information to field commanders. An expert AI platform that can handle data ingestion rates as high as 100 terabytes per second enables real-time classification and processing, making it a game-changer for commanders in the field and decision makers at the edge.
We are already seeing news about the next version of 5G. What do you think the potential of 5G will be as the technology develops?
Looking to the future, the next version of 5G has transformative potential, especially in optimizing operations on flight lines and in ports. Consider the amount of data an aircraft generates during combat and training flights; it can run to several terabytes. Advanced 5G technology will facilitate real-time transfer of sensor information, including details about weapons, pilots and aircraft maintenance. This real-time capability gives ground crews the information they need to make informed decisions: efficiently resupplying aircraft, assessing pilot readiness for relaunch and identifying potential emergency repair and maintenance needs. A similar paradigm can be applied to ships entering port, ushering in a new era of data-driven efficiency and decision making in both air and maritime settings.
One of the most pressing concerns about data in today’s digital environment is the sheer volume of data being processed. What do you think is a more viable solution to tackle this challenge?
Data mesh and smart technologies are available in the market today to tackle data challenges. The bigger problem is that there are many solutions, and most only address part of the problem. Hitachi has an extensive framework called Pentaho+, a comprehensive platform that can natively ingest over 300 types of data (with open APIs to map many others), with ETL and cataloging capabilities and AI capabilities for learning how individuals interact with data. There is also a tiering system called Data Optimizer that determines how and where to tier data based on performance, cost and other factors. This allows organizations to keep heavily used data on NVMe (Non-Volatile Memory Express) drives for instant access, move less-used data to rotating disks in the cloud or to archive, and keep metadata available at all times so data can be recalled when needed. It is important to be able to create a global mesh of this data, with metadata that tells you where each dataset resides. That also helps with accessing data, deduplicating it and replicating it to the right location. This type of comprehensive framework is what every government agency and department needs to optimize its data operations.
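The tiering idea Fuster describes can be sketched in a few lines. This is a hypothetical illustration of the general technique, not the actual Data Optimizer product or its API; the thresholds and tier names are invented, and real systems would weigh many more factors (cost, compliance, replication).

```python
from dataclasses import dataclass

@dataclass
class DataSet:
    """Minimal stand-in for a tracked dataset (fields are illustrative)."""
    name: str
    days_since_access: int
    accesses_per_month: int

def choose_tier(ds: DataSet) -> str:
    """Place hot data on NVMe, warm data on cloud disk, cold data in archive."""
    if ds.days_since_access <= 7 and ds.accesses_per_month >= 100:
        return "nvme"        # instant access for heavily used data
    if ds.days_since_access <= 90:
        return "cloud-disk"  # slower, cheaper rotating disk in the cloud
    return "archive"         # cheapest tier; metadata stays searchable

# Invented example datasets spanning the three tiers.
datasets = [
    DataSet("mission-telemetry", days_since_access=1, accesses_per_month=500),
    DataSet("quarterly-report", days_since_access=30, accesses_per_month=4),
    DataSet("2019-imagery", days_since_access=400, accesses_per_month=0),
]

for ds in datasets:
    print(ds.name, "->", choose_tier(ds))
```

The key design point, as in the interview, is that the metadata (name, access history, tier) remains queryable regardless of where the bytes live, so archived data can always be located and recalled.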
Finally, Phil, how do you think the proliferation of commercial entrants into space is changing the field? What trends and changes do you foresee going forward?
Commercial entry into space is a game changer. The ability to create and deploy technology without the constraints of contracts, fair competition, budgets or bureaucracy is transformative. Commercial organizations like SpaceX, Virgin and Blue Origin are developing technology faster than traditional methods allow, enabling missions and government agencies to pivot quickly as needed. They are fielding technologies like StarLink that will change the ability to communicate in contested environments; field commanders and remote users can access data and communicate even in the most difficult conditions. While OTAs (Other Transaction Authorities) can help traditional development move in a timely manner, they cannot match the scale at which commercial organizations can operate internally at their own cost and risk.