tutorial // Feb 24, 2023
How to Break a Large Array Into Chunks with JavaScript
How to write a function that takes a larger array and breaks it down into an array of smaller chunks.
Getting Started
Because the code we're writing for this tutorial is "standalone" (meaning it's not part of a bigger app or project), we're going to create a Node.js project from scratch. If you don't already have Node.js installed on your computer, read this tutorial first and then come back here.
Once you have Node.js installed on your computer, from your projects folder (e.g., ~/projects), create a new folder for our work:
Terminal
mkdir app
Next, cd into that directory and create an index.js file (this is where we'll write our code for the tutorial):
Terminal
cd app && touch index.js
Next, we want to init our package.json file:
Terminal
npm init -f
This will automatically create a package.json file at the root of our project. The -f here will "force" the creation, skipping the npm init wizard. If you prefer to use that wizard, just omit the -f and answer the prompts it presents.
In the package.json file that was created, add the field "type": "module" as a property. This will enable ESModules support and allow us to use the import statements shown in the code below.
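After that change, your package.json will look roughly like this (the other fields come from npm init and will vary with your folder name and npm version):
/package.json
{
  "name": "app",
  "version": "1.0.0",
  "main": "index.js",
  "type": "module"
}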
Finally, we need to install one dependency, Faker:
Terminal
npm i @faker-js/faker
With that in place, we're ready to get started.
Creating a test array
To contextualize our work, first, we're going to generate a large array of data that we want to chunk. Typically, "chunking" an array comes into play when you need to process large amounts of data and breaking that data into chunks is more efficient, or at least, easier to reason about.
/index.js
import { faker } from '@faker-js/faker';

const users = [...(new Array(100))].map((_user, index) => {
  return {
    _id: index,
    name: {
      first: faker.name.firstName(),
      last: faker.name.lastName(),
    },
    emailAddress: faker.internet.email(),
    telephone: faker.phone.number(),
    address: faker.address.streetAddress(),
  };
});
In the /index.js file that we created above, we start by importing the named export faker (denoted by the curly braces wrapping the variable) from the @faker-js/faker package we installed earlier. This gives us access to Faker, a tool for generating random, fake data.
Putting it to use, below this, we create a variable users, setting it to an array [] containing the statement ...(new Array(100)). Here, new Array(100) creates a new array of 100 empty items, or if you prefer, "placeholders." We convert this array of 100 empty items into an array of undefined items by using the ... or "spread" operator, which tells JavaScript to "unpack" or "spread out" the contents of the value it's prefixed to into the array wrapping it. This step matters because .map() skips over the "empty" slots in an array created with new Array(), but it does visit undefined values.
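To see the difference for yourself, here's a minimal sketch (the exact formatting of the logged output may vary slightly by Node.js version):
// new Array(3) creates three "empty" slots, which .map() skips entirely.
console.log(new Array(3).map(() => 'x')); // [ <3 empty items> ]

// Spreading first converts those slots into real undefined values, which .map() does visit.
console.log([...(new Array(3))].map(() => 'x')); // [ 'x', 'x', 'x' ]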
The end result is that we get an array that looks something like [undefined, undefined, undefined, ...] 100 times over. To make this useful, next, we call the .map() function on that resulting array. Our goal is to return an object representing a user for each "placeholder." Here, we expect the function we pass to .map() to receive two arguments:
- _user which is an arbitrary name we've assigned to the current item being mapped or iterated over in the array.
- index which is the current zero-based index of the current item being mapped or iterated over in the array.
We've put a _ on user to denote that it will not be used. We still need to write it, though, because we need to get access to the index argument. Using _user communicates to us "this needs to be here, but we don't use it in the code below."
Inside of the function, all we're doing is returning an object that roughly represents a fake user. For the _id, we repurpose the index value and then, using the Faker package we imported above, generate some fake data that resembles a real-world user.
If you're curious, there are a lot of different APIs offered by Faker for generating data.
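For example (these method names come from the Faker v7 API this tutorial uses; the values are randomized on every call, so yours will differ):
console.log(faker.internet.email()); // e.g., 'Myrtice_Gleason28@gmail.com'
console.log(faker.address.city());   // e.g., 'Lake Gardnerstad'
console.log(faker.name.firstName()); // e.g., 'Else'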
That's all we need for our test array. If you add a console.log(users) to your code and then, in your terminal, run node index.js from the root of your project, you will see an array of 100 fake user objects logged out.
Writing a function to "chunk" the array
Now for the important part. While 100 users isn't a crazy amount of data, it's a good test case to show off how this would work for, say, thousands or even millions of users (or whatever large array you might have). To keep our work clean, we're going to create a separate file that will store our chunking function:
/chunkArray.js
export default (array = [], chunkSize = 1) => {
  const chunks = [];
  const arrayToChunk = [...array];

  if (chunkSize <= 0) {
    return array;
  }

  while (arrayToChunk.length) {
    chunks.push(arrayToChunk.splice(0, chunkSize));
  }

  return chunks;
};
It's not too complicated, so we've added everything here. Let's step through it.
First, we want to anticipate our function receiving two arguments: array and chunkSize. The first will be the array that we want to split into chunks and the second is the number of items we want to hold in each chunk.
Inside of the function, we initialize the array that will hold our chunks as const chunks. Next, to ensure we don't destroy our source data, we create a variable arrayToChunk which is just a copy of the array we passed in (using the ... spread operator we learned about earlier to "copy" or "spread" the passed-in array into a new array). If we didn't do this, we'd be modifying the actual array passed in, which could (hypothetically) create issues elsewhere in our code.
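A quick sketch of why the copy matters, using a throwaway array:
const source = [1, 2, 3];
const copy = [...source];

// Mutating the copy leaves the source untouched.
copy.splice(0, 1);

console.log(copy);   // [ 2, 3 ]
console.log(source); // [ 1, 2, 3 ] (unchanged)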
Next, with this, we first want to check and see if we've been passed a valid chunkSize (greater than zero). If we haven't, we just want to return the unmodified array.
The important part here is the while() loop. Here, we're saying "while arrayToChunk has data in it, keep looping," and for each iteration, we call the .splice() method to do two things:
- Get the items from the array starting at index 0 and ending at the index matching chunkSize (e.g., in our test case, we'll have .splice(0, 10)) and return them as a new array.
- Simultaneously, modify arrayToChunk by removing those same items.
In other words, with arrayToChunk.splice(0, chunkSize) we're saying "grab this chunk from that array." We're not creating a copy of those items; we're extracting them permanently.
With this new array/chunk extracted, we hand it to chunks.push() to add it to our new chunks array.
Once we complete our while() loop (meaning we've run out of items to chunk), we go ahead and return our chunks array from our function.
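Before wiring this up to our test data, a quick sanity check with a small array makes the behavior easy to see (a throwaway snippet, not one of the tutorial's files):
import chunkArray from './chunkArray.js';

console.log(chunkArray([1, 2, 3, 4, 5], 2)); // [ [ 1, 2 ], [ 3, 4 ], [ 5 ] ]
console.log(chunkArray([1, 2, 3], 0));       // [ 1, 2, 3 ] (invalid chunkSize returns the array as-is)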
Putting it all together
Now that we have a way to create our chunks and some test data, let's see how this all fits together:
/index.js
import { faker } from '@faker-js/faker';
import chunkArray from './chunkArray.js';

const users = [...(new Array(100))].map((_user, index) => {
  return {
    _id: index,
    name: {
      first: faker.name.firstName(),
      last: faker.name.lastName(),
    },
    emailAddress: faker.internet.email(),
    telephone: faker.phone.number(),
    address: faker.address.streetAddress(),
  };
});

const chunkedUsers = chunkArray(users, 10);

console.log(
  JSON.stringify(chunkedUsers, null, 2)
);
Back in our /index.js file, we've added a new variable chunkedUsers set to a call to the chunkArray() function which we've imported up top. To it, we pass our users array and our chunkSize of 10.
Finally, we log out the stringified version of the resulting chunkedUsers. The null, 2 passed after our chunkedUsers array tells JSON.stringify() to skip any replacer function (the null) and to pretty-print the string using 2 spaces of indentation (without this, we'd get one massive compressed string; with it, we get something that looks more like what we'd see in our text editor or IDE).
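For a quick before-and-after of those arguments:
console.log(JSON.stringify({ _id: 0, name: 'Example' }));
// {"_id":0,"name":"Example"}

console.log(JSON.stringify({ _id: 0, name: 'Example' }, null, 2));
// {
//   "_id": 0,
//   "name": "Example"
// }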
With this, from the root of your project in your terminal or command line, run node index.js to see the resulting chunked array.
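Because Faker randomizes the data, your output will differ on every run, but a couple of quick (hypothetical) checks confirm the shape:
console.log(chunkedUsers.length);    // 10 (chunks)
console.log(chunkedUsers[0].length); // 10 (users per chunk)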
Wrapping up
In this tutorial, we learned how to take a large array and break it into smaller chunks. First, we learned how to create a large array with some test data using Faker, and then, how to write a function to help us break our array into chunks. Finally, we learned how to log out our resulting array to verify that our chunking function worked as expected.