In this chapter, we're going to add background jobs to our project. We're going to start by setting up the AI SDK, which will allow us to create some long-running tasks, such as querying an AI model and getting a response back. Then we're going to call that model through a normal API route, which will serve as our "blocking" example. Then we're going to do the exact same thing, but this time by running it through an Inngest background job, and finally we're going to compare blocking versus non-blocking. So let's start by adding the AI SDK to our project.
I'm going to be using AI SDK version 6. So in my case, I can go ahead and install it with the latest tag, because that resolves to version 6, and I'm going to immediately show you that. Inside of my package.json, if I search for "ai", you can see that my version is 6.0.3. If you want to use the same version as me, you can pin it to that, though I don't think that these minor versions matter too much.
I think that the major version 6 is the most important part, because between versions 4, 5, and 6 there are quite a few breaking changes in how you use it. That's why I'm making you aware of the version I'm using at the time of making this tutorial. Now let's go ahead and click on the documentation here, and let's click on providers and models. This is your choice: you can choose whatever provider you want.
I would highly, highly suggest using the Anthropic provider. Anthropic models are by far the best when it comes to agentic coding, and you will definitely feel that in building an app like this, which will have to loop itself through tool calling until it reaches a certain result. A lot of the other providers and models are not as good as Anthropic when it comes to that. The same goes for their UI and code generation models: they are just really, really good.
So I'm going to show you two examples. I'm going to show you how to set up AI SDK Google, because it's completely free, and I'm going to show you how to set up Anthropic, because that's what I will be using going forward. If you're wondering about the total cost for Anthropic to finish this project, it's going to be around $5 to $10. So nothing too much, but I completely understand if for some of you that is not an option for whatever reason.
So I'm going to start with Google, because it's free. Let's start with the Google Generative AI provider. Let's click on npm and let's install it. Again, I'm going to use the latest tag, simply so I know that whatever version gets installed here is compatible with my latest ai version. So if I search for @ai-sdk/google, it's 3.0.1.
So if you want to, you can pin that version. Now that we have that, we have to go ahead and create a very simple API route that will call it. I'm going to create an api folder within my app folder here, then a demo folder, inside of that a blocking folder, and inside of the blocking folder a route.ts. What we have achieved with this folder structure is the following API route: localhost:3000/api/demo/blocking.
And if you send a POST request to this route right here, we're going to trigger Google's Gemini provider to generate some text. So let's go ahead and export an async function called POST. Now we have registered this as a POST route, so it will return something from here. Inside of the response right here, what I want to do is add this.
So I'm just going to copy it here. We can import generateText from the ai package which we've just installed, and we can import google from @ai-sdk/google, and then we can return NextResponse.json, which you can import, or you can just do Response.json and pass in the response. So this is how you would implement a normal Gemini call. The problem is we don't have an API key, so this will fail for now. Using the link on the screen, you can visit Google AI Studio, and in here we can create one free API key.
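As a rough sketch, the blocking route might look something like this; the model name and prompt are the ones used in this demo, but treat them as illustrative, and this assumes the API key is supplied via the GOOGLE_GENERATIVE_AI_API_KEY environment variable (which we set up a bit later):

```typescript
// app/api/demo/blocking/route.ts (sketch, not the exact code from the video)
import { generateText } from "ai";
import { google } from "@ai-sdk/google";

export async function POST() {
  // This awaits the full model response before replying: the "blocking" part.
  const { text } = await generateText({
    model: google("gemini-2.5-flash"),
    prompt: "Write a vegetarian lasagna recipe for 4 people.",
  });

  return Response.json({ text });
}
```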
So I'm going to name this Polaris dev, and let me show you how you can easily create a new project. So, Polaris dev, create a project, and once that project has been created, you can go ahead and select it. There we go, it is automatically selected. I will call this Polaris-Dev.
I'm going to click create key, and once we create this key, I'm going to copy it. And I think I can add it here, as an apiKey option. Let me go ahead and research a bit how I can pass it directly; usually you do it through an environment variable, but I want to show you how you can do it right here. All right, so I think the way you need to do this is by using createGoogleGenerativeAI and then defining your own google instance like this.
Pass the apiKey option and add the key there. So let me copy this again, let's add it right here, and then we're going to use that google instance. You can of course add this to your .env file instead; I'm going to show you how to do that in a moment, but let's try it directly like this first. So let's go ahead and try this out, and I'm going to create a very simple UI so we can test it.
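For reference, the explicit-key variant might look like this sketch; the placeholder string is obviously not a real key:

```typescript
// Sketch: creating a provider instance with an explicit API key instead of
// relying on the GOOGLE_GENERATIVE_AI_API_KEY environment variable.
import { createGoogleGenerativeAI } from "@ai-sdk/google";

const google = createGoogleGenerativeAI({
  // Hard-coding a key is fine for a quick local test, but move it to your
  // .env file before committing anything.
  apiKey: "your-api-key-here",
});

export { google };
```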
Inside of the app folder, I'm going to create a demo folder and then a page.tsx inside. What this will do is create a page that we can access on localhost:3000/demo. Let's start by adding "use client" at the top so we can do some interactivity. Export default function DemoPage. Let's go ahead and return a div.
Let's give this div a className with some padding, p-8, and space-x-4. And let's import the Button from components/ui/button and give it a label of Blocking.
Then let's implement a very simple async handleBlocking function, which awaits a fetch to /api/demo/blocking with a POST method. So this is the exact API route which we have created. You can see the cascading here: api/demo/blocking in the folder structure, /api/demo/blocking in the fetch call.
As long as it starts with the app folder and goes into api, you've done it correctly. Then let's give the button an onClick and paste the handler in. There we go. I'm going to try it now; let's just make sure we are running npm run dev. And here's the thing: I actually tried doing this chapter a few times already, and every time I tried, my Google provider failed.
That's why I told you: if possible, use something other than the Google provider, because it's just not reliable in its current state. I'm going to try one more time, and we'll see, maybe this will be the successful one. Let's see. It's doing something. I'm not sure if it's failing or not.
As you can see, it is definitely hitting that /demo route. And there we go, this time we managed to do it. And here it is: here's a delicious and hearty vegetarian lasagna recipe.
So basically, we just had an API call generate a vegetarian lasagna recipe for four people using Gemini 2.5 Flash and a free API key. Yeah, for some reason, many times I've tried this, it just didn't work for me. I don't know why it kept throwing random errors, like my quota being exhausted, even though I'd only made one request to Gemini in the whole month. So I'm not sure how that happened. Then it kept saying that my API key was invalid.
So I don't really know. Now let me show you how you can add the key to your .env file. If you don't want to use it inline like this, all of that is actually documented here: you can see that when you scroll down, the provider reads the key from GOOGLE_GENERATIVE_AI_API_KEY. So inside of your .env.local, add a # AI comment, and then add the variable under it.
There we go. Then you no longer have to use createGoogleGenerativeAI at all; you should be able to just import google and use it directly, without defining your own instance. Let's check if it still works. Maybe the problem was in that environment file.
Yeah, it looks like it works now. Great. So you can see that our first example is now working. Now what we have to do is slightly modify our page.tsx to show that blocking state, because so far we've only seen it in the network tab, and I don't think we visually understand what's actually happening.
Because of that, I'm just going to implement a super simple useState here which will keep track of loading: loading and setLoading. I'm going to set loading to true when we start the request and set it to false when we finish. Then I'm going to make the button disabled while it's loading, and if it's loading, I'm going to show "Loading".
Otherwise I will show "Blocking". Like this. Hopefully this will give us a better understanding of what's going on. So when I click on Blocking, you can see how it takes 1, 2, 3, 4, 5, 6, 7... oops, I refreshed. Do you see the point?
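Putting those pieces together, the demo page might look something like this sketch; the Button import assumes a shadcn/ui setup under components/ui, and the class names are just the ones mentioned above:

```typescript
// app/demo/page.tsx (sketch)
"use client";

import { useState } from "react";
import { Button } from "@/components/ui/button";

export default function DemoPage() {
  const [loading, setLoading] = useState(false);

  const handleBlocking = async () => {
    setLoading(true);
    // The UI sits in this loading state until the model finishes responding.
    await fetch("/api/demo/blocking", { method: "POST" });
    setLoading(false);
  };

  return (
    <div className="p-8 space-x-4">
      <Button disabled={loading} onClick={handleBlocking}>
        {loading ? "Loading..." : "Blocking"}
      </Button>
    </div>
  );
}
```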
The point is that when it comes to long-running tasks like this, you cannot let your user wait for 10, 20, 30 seconds. This is not going to stay a super simple AI query; later it's going to be a super complicated one, and it's going to last for maybe up to a minute if we are generating an entire application. You can't let your user hang like this, especially given that they can accidentally refresh, they can navigate back, and all of those things. And it's not really the problem that it takes too much time; the problem is that we are handling this through a normal, blocking request.
Whereas what we should be doing is simply using an API route to trigger a background job and then immediately tell the user, hey, we triggered the background job, you are free to do whatever you want. We are going to alert you when it's finished. You can go out and take a walk, you can shut down your laptop, whatever you want to do, it's going to be ready when you come back. That is what we want to do. So let's go ahead and learn how to implement that.
Using the link on the screen, you can visit Inngest, and in here, let's go through their documentation. One great thing about Inngest is that they offer a completely accountless setup: you can follow the Next.js quick start without ever creating an account, until it comes to deployment, of course. So for now, let's just follow the Next.js app router instructions here.
The first thing we're going to do is install Inngest. For now I'm going to shut down the app and just do npm install inngest, and then I'm going to show you the exact version of Inngest that I'm using, since this is an important package. So I'm going to go inside of package.json, search for inngest, and let me give it a moment to install. Here we go. My Inngest version is 3.48.1, so if you wanted to, you could pin that.
Great, that is step one finished. Now we have to run the Inngest dev server, which is basically a fast, in-memory version of Inngest where you can quickly send and view events as functions run. So let's copy this and run it here: npx --ignore-scripts=false inngest-cli@latest dev.
It will be best to find this command in the docs on your own, so you don't have to read it off my screen. It offers to install a new package, so feel free to say yes here. What this will do is spin up a little local instance of Inngest, which you can visit on localhost:8288, so feel free to visit that, and in here you will find the Inngest dashboard. Right now there is nothing to be found: there are no applications connected, there are no functions, there are no runs, nothing.
That's what we're going to do next: connect Inngest to our project. In order to do that, we need to create src/inngest/client.ts. So I'm going to go inside of my code editor here, into src, and then create an inngest folder with a client.ts inside. There we go.
I'm going to paste this inside. We are importing Inngest from the inngest package, and I'm going to give the client the id polaris. There we go. Once we have that, let's go ahead and register this in an API route, because just adding this is not enough. You also need to have your app running at the same time, and what you will see now is that the Inngest dev server is trying to hit a bunch of endpoints. It's trying to figure out what our project is.
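The client file itself is tiny; as a sketch, with "polaris" being just this project's chosen id:

```typescript
// src/inngest/client.ts (sketch): the Inngest client singleton for this project.
import { Inngest } from "inngest";

// The id is how this app identifies itself to the Inngest dev server.
export const inngest = new Inngest({ id: "polaris" });
```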
It's trying Netlify functions, Redwood functions, a bunch of things. This is where we are going to register our endpoint, and once the dev server finds it, that's the only endpoint it's going to hit. So let's go inside of the app folder, into our api folder, create a new folder inside called inngest, and then a route.ts inside of it. I'm going to copy the contents from the docs and paste them in here.
We import serve from inngest/next, and then we import the client from our inngest/client file. You can use a path alias for this, like that; it just points to src/inngest/client. And in here we export GET, POST, and PUT.
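That registration route might look like this sketch (the "@/" alias is assumed to point at the src folder):

```typescript
// app/api/inngest/route.ts (sketch): exposes our Inngest functions to the dev server.
import { serve } from "inngest/next";
import { inngest } from "@/inngest/client";

// GET, POST, and PUT are all handled by Inngest's serve() helper.
export const { GET, POST, PUT } = serve({
  client: inngest,
  functions: [], // functions get added here as we create them
});
```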
Save that, and now when you scroll down, you should see success messages here: it finally managed to find our app. If you go inside of the Inngest server and open Apps, you can see it auto-detected the app, so it knows it's Next.js. But we still haven't added any functions, so it's not exactly usable yet. We now have to create our very first function. Inside of src/inngest, I will create functions.ts and paste this here. We are importing the Inngest client, and we are exporting a constant called helloWorld. Inside of it, we are creating a function with an id of hello-world and an event of test/hello.world. The event name is what will be used to trigger the function, and we're going to demonstrate triggering in a moment.
What this function will do is sleep for one second and then return a message. One way we can immediately test whether this works is by going inside of the Inngest server. So let's go ahead and find our functions here. If nothing is found, you can resynchronize by rerunning the Inngest dev server, and maybe even npm run dev, to make sure everything works fine.
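The function described above, closely following the Inngest quick start, might look like this sketch:

```typescript
// src/inngest/functions.ts (sketch): our first Inngest function.
import { inngest } from "./client";

export const helloWorld = inngest.createFunction(
  { id: "hello-world" },
  { event: "test/hello.world" },
  async ({ event, step }) => {
    // Each step is durable: if it fails, Inngest retries just that step.
    await step.sleep("wait-a-moment", "1s");
    return { message: `Hello ${event.data.email}!` };
  },
);
```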
Oh, that's not it; that's not all we have to do, my apologies. We now have to add that helloWorld function to our route.ts. So let's go inside of app/api/inngest/route.ts and add helloWorld.
There we go, imported from our Inngest functions file. And now, inside of your Inngest server, there we go: one function found. When you click on Functions, you can see it right here. What you can do now is invoke it. So I'm going to pass in an email here, and I'm going to use business@codewithantonio.com.
It really doesn't matter; let's just invoke the function. What this is going to do is run a background job. It happened pretty quickly, so you don't really notice the effect of it, but it actually ran a step called wait-a-moment, in which it waited for one second, and then it simply returned a message, hello, plus the data it received, my email. So you can kind of already guess where we are going with this, but now we're going to actualize it.
We're going to go inside of inngest/functions.ts and create a completely new function. Instead of helloWorld, let's do demoGenerate. The id will be demo-generate, and the event will be demo/generate, like that. Then, inside of a step.run called generate-text, I'm going to run an async method that awaits generateText, which we can import from ai.
I'm going to call the Google model from the AI SDK, Gemini 2.5 Flash or 2.0 Flash. Let me go ahead and see what I used in my blocking example. Well, I can just copy this, actually. So I'm just going to do that. There we go.
And in fact, I'm just going to return the output here, so this way I don't really need the event at all. Let's try that out now. I just have to register demoGenerate inside of app/api/inngest/route.ts. Let's refresh this and go back to our Inngest server.
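As a sketch, the demo/generate background function might look like this; the model and prompt mirror the blocking route, but now run inside a durable Inngest step:

```typescript
// src/inngest/functions.ts (sketch of the new function)
import { generateText } from "ai";
import { google } from "@ai-sdk/google";
import { inngest } from "./client";

export const demoGenerate = inngest.createFunction(
  { id: "demo-generate" },
  { event: "demo/generate" },
  async ({ step }) => {
    // step.run gives us durability: if this step throws, Inngest retries it
    // without re-running any steps that already succeeded.
    const output = await step.run("generate-text", async () => {
      const { text } = await generateText({
        model: google("gemini-2.5-flash"),
        prompt: "Write a vegetarian lasagna recipe for 4 people.",
      });
      return text;
    });
    return output;
  },
);
```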
Let's go inside of Functions and click invoke. The data really doesn't matter, it can be empty, so click invoke function. And what you see now is the exact same thing happening.
It's generating the lasagna recipe for four people, but through a background job, and this is now non-blocking. You can see exactly what it did right here. There we go: this is a vegetarian lasagna.
There we go. Now let's plug that into our app right here, so you can see how differently it behaves. I'm going to go inside of app/api/demo, copy the blocking folder, and paste it here again. And I'm going to rename it to non-blocking... or, let's call it background, like this.
If you still have the old file opened, that's just the Next.js cache, which sometimes gets confused when you rename folders; you can save that file, close it, and close the .next folder. Let's go back inside of our background route here. What I'm going to do inside of here is a little bit different: instead of directly calling generateText, I'm just going to await inngest.send, importing inngest from our Inngest client, with a name of demo/generate, an empty data object, and then return a status of started. And then let's remove the generateText call.
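That background route might look like this sketch; the key point is that nothing is generated here, we only enqueue the event:

```typescript
// app/api/demo/background/route.ts (sketch)
import { inngest } from "@/inngest/client";

export async function POST() {
  // send() resolves as soon as the event is queued,
  // not when the background job actually finishes.
  await inngest.send({ name: "demo/generate", data: {} });
  return Response.json({ status: "started" });
}
```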
And this is now accessible on the background API route. There we go. Now that we have that, let's go inside of our app/demo/page.tsx. In here, I'm going to create a handleBackground function, which again will call setLoading, but will hit the background API route instead. Then I'm going to duplicate this button right here; this one will be labeled Background, and it will call handleBackground. So let's look at the differences now. Let's repeat our knowledge, right?
Okay, sharing one loading flag between both buttons is not a good example, so I'm just going to use a very simple solution. Not the prettiest, but it works: have them use different loading state variables. So when you use the normal blocking one, you can see how long it takes.
This is how long your user is blocked, right? Your user cannot do anything right now, because we have blocked the entire UI, and we are waiting until the AI finishes responding. Finally it responded, and it unblocked the user, who can now do something with that output. Whereas with a background job, you can see that it finishes immediately and just starts running the job in the background.
So it's still not done, it's still creating the recipe, but the user is free to initialize another one, right? There we go: I immediately have another one running, or I can do, you know, three of them at once. All of them will execute in due time.
This is the power of background jobs. And this is just scratching the surface, because later we're going to use something Inngest calls Agent Kit. Their Agent Kit is honestly amazing, and we're going to use it to orchestrate AI agents into loops which call tools, and that will basically create an agent that can build something. We are going to create an entire network of agents which will be able to call each other, communicate with each other, and in the end create an actual Next.js project, or whatever the user describes in the prompt, as you saw in the demo. So that's the power of Inngest and the power of background jobs.
You can see how quickly this one finishes as opposed to this one. Don't get me wrong, the underlying speed is the same; it's just a matter of whether you handle the process in a blocking or a non-blocking way. So that's the power of background jobs.
That's what I wanted to demonstrate: we are now comparing blocking versus non-blocking. Great. So, as I said, I also promised I would show you how to set up Claude.
So let me go ahead and go through that to end the chapter. I would recommend doing this even if you don't plan on using Anthropic, simply because it is a much better model and it will unlock a lot of new possibilities for you. In fact, Google Gemini actually had problems with tool calling in the past, even a complete inability to do it. I think that's changed now, but you can see just how advanced these Anthropic models are and why I prefer them so much. So let me just expand my terminals here.
npm install @ai-sdk/anthropic. Then I'm going to show you the version: @ai-sdk/anthropic 3.0.1. I believe all providers are on the same version.
Great. And then you can see that inside of here, they explain how you can do the exact same thing, and that your API key should be stored inside of ANTHROPIC_API_KEY. I suggest you find this in the documentation yourself, so you don't just blindly trust me in case they change it. So, ANTHROPIC_API_KEY.
How do you create an Anthropic API key? By using the Anthropic Console; you can use the link on the screen. I added $5 to my account. I think in the entire development of this project, with extensive testing, I spent a maximum of $15, and we won't be doing as much testing in this app.
So I think you should be completely fine with $5 to $10. But again, I completely understand if that's not possible for you; that's why I first showed you how to do it with a free API key. So I'm going to go ahead and call this Polaris Dev. You can of course choose a proper workspace or whatever, but I'm just going to use the default one.
And there it is, the API key. I'm just going to paste it here. There we go. Then I'm going to go back here and scroll down until I find the basic generateText example.
Here it is. So I'm just going to replace both my blocking and background generateText calls to use the Anthropic model now, imported from @ai-sdk/anthropic, not from Inngest. There we go. And I'm going to copy this.
And I'm going to go inside of my inngest/functions.ts, that's the file, and in here I'm just going to replace this generateText to use Anthropic. There we go. So both my src/inngest/functions.ts and my app/api/demo/blocking/route.ts are now modified to use Claude Haiku.
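Swapping providers in the Inngest function might look like this sketch; the exact Claude model id is an assumption here, so check the @ai-sdk/anthropic docs for the current one:

```typescript
// src/inngest/functions.ts (sketch after the provider swap)
import { generateText } from "ai";
import { anthropic } from "@ai-sdk/anthropic";
import { inngest } from "./client";

export const demoGenerate = inngest.createFunction(
  { id: "demo-generate" },
  { event: "demo/generate" },
  async ({ step }) => {
    return await step.run("generate-text", async () => {
      // The anthropic provider reads ANTHROPIC_API_KEY from the
      // environment by default; the model id below is illustrative.
      const { text } = await generateText({
        model: anthropic("claude-3-5-haiku-latest"),
        prompt: "Write a vegetarian lasagna recipe for 4 people.",
      });
      return text;
    });
  },
);
```

Note how nothing else about the function changes: only the provider import and the model id, which is the whole point of the AI SDK's provider abstraction.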
And I have added my ANTHROPIC_API_KEY. So let's just give it a shot to see if it's working or if something is going wrong. I'm seeing no errors here. There we go, both of these seem to be working just fine.
Here it is. Let's see: here is a vegetarian lasagna recipe that serves four people. You can see the difference in their outputs. And in fact, I think somewhere here you should be able to see it.
There we go: provider metadata. So this one used Anthropic. Perfect. So it's that easy to actually change providers.
You can of course choose whatever provider you want. If you have an existing API key for OpenAI, for example, you can definitely use that; it is also a great one. I just personally prefer Anthropic: whenever I work with these kinds of projects, Anthropic provides the best results.
I believe that's all we wanted to do for this chapter, so let's go ahead and merge these changes. First things first, let's see all the changes that we actually have, so we can confirm we are on the same page. I have eight changes.
Two of them are package-lock.json and package.json. Then I added routes for a background demo and a blocking demo, the route which registers Inngest, the page where we have the UI for our blocking and background buttons, the client, which is just an Inngest singleton, and functions, where we actually write the background jobs. So I'm going to shut down all of my apps for now, and I will simply do git add ., then git commit with the message "04 background jobs", then git checkout -b 04-background-jobs, then git push origin 04-background-jobs, and then we're going to go ahead and review this pull request. Perfect. You can see how I have switched my branch, even in the visual part here.
Now let's open GitHub and open a pull request. As usual, here on the top I have a new branch, so I'm going to open it, create a pull request, and read the summary by CodeRabbit. New features: we added a new demo page with interactive examples comparing blocking and background operations.
We implemented a synchronous API endpoint for immediate text generation requests (the blocking one), and we implemented an asynchronous API endpoint for queuing background tasks. We also added backend infrastructure to process and manage queued operations. So let's take a look at the comments. Most of the comments are due to the fact that this is just initial demo code: we didn't handle errors, we didn't handle retries, absolutely nothing.
You can see that we always return a started status here, even though this step can technically fail; as long as you're awaiting something, it can fail, right? So it's those kinds of things that we should improve later on, but since most of this code will actually be deleted later, it doesn't matter. Otherwise, these are obviously all very valid concerns. It noticed that too; same thing here.
So there are some improvements we should do, but this is just for the demo. You can see it by the naming, right? loading2 is not exactly the industry standard when it comes to naming. And yes, in here it actually warns us that this wouldn't work in production, which is true.
There is a separate process for enabling Inngest in production; luckily, Vercel offers a one-click setup for that, so we will worry about it when the deployment step comes. Amazing. Other than that, let's go ahead and merge this pull request.
As always, I'm not going to delete my branch, so I have a clear history of all of my chapters. You can see that I can now go back to anything I want. Great! So let's go back to my main branch and do git pull origin main. I have pulled the remote changes we just merged, and I always like to confirm that everything is fine by going inside of here, into the graph, and just confirming that the same behavior is present.
We checked out to 04, and then we merged that back into main. Amazing. I believe that marks the end of this chapter. We've set up the AI SDK, created a blocking API route, and then did the same thing with a queued background job. And finally, we compared blocking versus non-blocking and its benefits.
And I hope I explained why we need background jobs, especially in apps like this. Amazing, amazing job, and see you in the next chapter.