
This chapter covers
- Persisting data into a relational database
- Streaming data using Apache Kafka
- Incorporating event-driven principles
- Analyzing data to monitor asset locations using Spark

The last chapter laid the foundation for our information technology asset management (ITAM) system. However, this application will not fulfill our requirements without data. Data is the lifeblood of every application. That is what this chapter is all about: the various ways we can use generative AI to create data, stream data, transform data, react to data, and learn from data.
Perceptive readers may have noticed in the last chapter that our data access pattern would not have worked as written, because it was incomplete. The opening section of this chapter addresses that gap. After that, we will set up our database, fix the classes that access this data, and load some sample data to use throughout the rest of the chapter, along the lines of the sketch below.
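To give a feel for where we are headed, here is a minimal sketch of that kind of persistence setup: define a table, create it, and load a sample row. It assumes a Python stack with SQLAlchemy and SQLite; the `Asset` model, its columns, and the `itam.db` file are illustrative placeholders rather than the project's actual schema, which we will generate and refine later in the chapter.

```python
# A minimal, illustrative sketch: create a relational table and load one sample row.
# The Asset model and SQLite URL are placeholders, not the ITAM system's real schema.
from sqlalchemy import Integer, String, create_engine
from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column, Session


class Base(DeclarativeBase):
    pass


class Asset(Base):  # hypothetical ITAM entity used only for this sketch
    __tablename__ = "assets"

    id: Mapped[int] = mapped_column(Integer, primary_key=True)
    name: Mapped[str] = mapped_column(String(100))
    location: Mapped[str] = mapped_column(String(100))


# A local SQLite file keeps the sketch self-contained; swap in your real database URL.
engine = create_engine("sqlite:///itam.db")
Base.metadata.create_all(engine)

# Load a single sample record so queries in later sections have something to return.
with Session(engine) as session:
    session.add(Asset(name="MacBook Pro", location="Headquarters"))
    session.commit()
```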