Navigating S3 Using the AWS SDK for JavaScript: A Guide

In this post, we'll look at how to navigate S3 using the AWS SDK for JavaScript and build a simple, practical project to demonstrate with examples.

The AWS SDK for JavaScript is a library that allows developers to work with Amazon services like Simple Storage Service (S3) from the client side. It’s a powerful tool for building apps and services.

I recently had to use it on a project and thought I would share how to use it as well as some practical examples with S3.

Install AWS SDK for JavaScript

To install the AWS SDK for JavaScript, you’ll need to run npm install aws-sdk.

How To Use The SDK

The best reference for using the AWS SDK for JavaScript is the API Reference Guide.

Once on that page, choose S3 in the left column.

Here you’ll find all the methods needed to interact with S3 as well as examples of each.

AWS SDK for JavaScript S3 Examples

Let’s create a simple project to perform these actions.

We’ll give examples of:

  1. Uploading a file
  2. Downloading a file using a Presigned URL
  3. Reading and writing tags to an object

By the end, you’ll have a single page that can upload a file, generate a download link, and read and write tags on an S3 object.

We’ll use the React library to get things spun up quickly.

  1. First, create a new project by running npx create-react-app testing-s3 and change into that directory with cd testing-s3.

  2. Sign in to your AWS account and create a new IAM user with the appropriate S3 permissions. Since this is a demo, I’ll create a user called svc.s3 with programmatic access (save your keys). For permissions, I’ll give it AmazonS3FullAccess.

  3. Create a new bucket.

  4. Create a .env file in the root directory of your React project and add the following lines (these variable names match the code we’ll write below; fill in your own values):

    REACT_APP_AWS_ACCESS_KEY=your-access-key
    REACT_APP_AWS_SECRET_KEY=your-secret-key
    REACT_APP_REGION=your-region
    REACT_APP_BUCKET_NAME=your-bucket-name

  5. Run npm install aws-sdk to install the AWS SDK. You’ll also need to run npm install util.

  6. Run npm start and open the browser to http://localhost:3000.

  7. Finally, set the CORS policy on your S3 bucket to allow the browser to access it:

    [
        {
            "AllowedHeaders": ["*"],
            "AllowedMethods": [
                "HEAD", "GET", "PUT", "POST", "DELETE"
            ],
            "AllowedOrigins": ["http://localhost:3000"]
        }
    ]

Example 1: Uploading a File

First, let’s set up a way to upload files to S3.

Create a new file in the /src folder called awsService.js and add the following lines to it:

import AWS from 'aws-sdk'

const S3_BUCKET = process.env.REACT_APP_BUCKET_NAME
const REGION = process.env.REACT_APP_REGION

AWS.config.update({
    accessKeyId: process.env.REACT_APP_AWS_ACCESS_KEY,
    secretAccessKey: process.env.REACT_APP_AWS_SECRET_KEY
})

const s3Client = new AWS.S3({
    params: { Bucket: S3_BUCKET },
    region: REGION
})

export const uploadFile = async (file) => {
    if (file === null) {
        return
    }

    const params = {
        Key: file.name,
        Body: file,
        ContentType: file.type
    }

    try {
        const s3Response = await s3Client.putObject(params).promise()
        return s3Response
    } catch (e) {
        return e
    }
}
Pretty self-explanatory. A few things to note:

  1. We are adding the Region and Bucket to the client itself. This way we don’t have to redefine these for each function.
  2. We are passing the entire file object. This way we can get the file name and file type from it.
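One caveat with keying on the file name: uploading a second file with the same name silently overwrites the first, because putObject keys on the object name. A minimal sketch of a uniqueness helper (the makeObjectKey name and the timestamp-prefix scheme are my own, not part of the SDK):

```javascript
// Hedged sketch: derive a collision-resistant S3 key from a file name.
// Prefixing with a timestamp keeps repeat uploads of "report.pdf" from
// overwriting each other; the naming scheme here is an assumption.
function makeObjectKey(fileName, now = Date.now()) {
    // Strip characters that are awkward in S3 keys and URLs
    const safeName = fileName.replace(/[^a-zA-Z0-9._-]/g, '_')
    return `${now}-${safeName}`
}
```

You could then pass makeObjectKey(file.name) as the Key in uploadFile instead of file.name.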

Next, let’s update the App.js file to include a way to attach a file and upload it to S3.

import { useState } from 'react'
import { uploadFile } from './awsService'
import './App.css'

function App() {
    const [file, setFile] = useState(null)

    const handleFileInput = async (e) => {
        setFile(e.target.files[0])
    }

    const uploadFileToS3 = async (e) => {
        await uploadFile(file)
    }

    return (
        <div className='App' style={{ marginTop: '50px' }}>
            <label htmlFor='myFile' style={{ marginRight: '10px' }}><strong>1. Upload File to S3</strong></label>
            <input type='file' id='myFile' name='myFile' onChange={handleFileInput} />
            <input type='submit' value='Upload' onClick={uploadFileToS3} />
        </div>
    )
}

export default App

A few things to note:

  1. We are using a plain HTML file input. We set an onChange event to store the selected file in state using the useState hook. React aside, the file upload feature is vanilla JavaScript: the file can be accessed in the event with e.target.files[0].
  2. When the submit button is clicked, we call the uploadFile function which uploads the file to S3.

Example 2: Downloading a File With a Presigned URL

To download a file, we’ll add another function to the awsService.js.

export const generateDownloadLink = async (key) => {
    const url = s3Client.getSignedUrl('getObject', {
        Key: key,
        Expires: 3600
    })
    return url
}

This generates a pre-signed URL for the file. We’ll use this to download the file.

Why aren’t we downloading it directly? Because getObject doesn’t save the object anywhere. It just “gets” its data. You have to implement the download yourself, which can easily be done with something like Node and the fs library.

One workaround for React is to generate this presigned URL on page load or after upload (using useEffect), put the URL in state, and add it as the href of a link or click of a button already on the page.

But for our purposes, we’ll just generate the URL.
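One nice property of presigned URLs is that a SigV4-signed URL carries its own expiry in the query string (X-Amz-Date and X-Amz-Expires), so the client can check for staleness before handing it to the user. A hedged sketch, where isUrlExpired is a hypothetical helper of my own:

```javascript
// Hedged sketch: decide whether a SigV4 presigned URL has expired by
// reading its X-Amz-Date (YYYYMMDDTHHMMSSZ) and X-Amz-Expires (seconds)
// query parameters. Assumes a v4-signed URL; older v2 signatures use a
// plain Expires parameter instead.
function isUrlExpired(presignedUrl, now = new Date()) {
    const params = new URL(presignedUrl).searchParams
    const amzDate = params.get('X-Amz-Date')            // e.g. 20240101T120000Z
    const expires = Number(params.get('X-Amz-Expires')) // e.g. 3600

    // Rebuild an ISO timestamp from the compact AMZ date format
    const iso = amzDate.replace(
        /^(\d{4})(\d{2})(\d{2})T(\d{2})(\d{2})(\d{2})Z$/,
        '$1-$2-$3T$4:$5:$6Z'
    )
    const signedAt = new Date(iso)
    return now.getTime() > signedAt.getTime() + expires * 1000
}
```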

So back to our App.js file, we’ll add the following to it:

import { uploadFile, generateDownloadLink } from './awsService'

const [downloadLink, setDownloadLink] = useState('')

const downloadFileFromS3 = async (e) => {
    const downloadURL = await generateDownloadLink('your-file-name')
    setDownloadLink(downloadURL)
}

<br /><br />
<label style={{ marginRight: '10px' }}><strong>2. Download File From S3</strong></label>
<input type='submit' value='Generate URL For Download' onClick={downloadFileFromS3} />
<div>Download URL: <a href={downloadLink}>{downloadLink}</a></div>

Example 3: Reading and Writing S3 Object Tags

Finally, perhaps you want to tag a file in S3 and mark it either checked out or checked in.

To tag my file, I’ll add the following function to the awsService.js file:

export const tagFile = async (key, checkedOut) => {
    const params = {
        Key: key,
        Tagging: {
            TagSet: [
                {
                    Key: 'checkedOut',
                    Value: checkedOut
                }
            ]
        }
    }

    try {
        const s3Response = await s3Client.putObjectTagging(params).promise()
        return s3Response
    } catch (e) {
        return e
    }
}
This function will tag the file with either a “true” or “false” value based on the checkedOut parameter.

And then in App.js I’ll add buttons to toggle the checkedOut tag between true and false.

import { uploadFile, generateDownloadLink, tagFile } from './awsService'

const handleFileCheckedOut = async (e) => {
    await tagFile('your-file-name', 'true')
}

const handleFileCheckedIn = async (e) => {
    await tagFile('your-file-name', 'false')
}

<br /><br />

<label style={{ marginRight: '10px' }}><strong>3. Tag File</strong></label>
<input type='submit' value='Tag File Checked Out' onClick={handleFileCheckedOut} />
<input type='submit' value='Tag File Checked In' onClick={handleFileCheckedIn} />

So if you click the Tag File Checked Out button it will set the tag checkedOut to “true.” If you click the Tag File Checked In button, it will set the tag checkedOut to “false.”

And then to “Get” the S3 object’s tags we’ll create a new function:

export const getTags = async (key) => {
    const params = {
        Key: key
    }

    try {
        const s3Response = await s3Client.getObjectTagging(params).promise()
        return s3Response
    } catch (e) {
        return e
    }
}
And in App.js we’ll add a button that will display the value of the tag by telling us if the file is checked out or not.

import { uploadFile, generateDownloadLink, tagFile, getTags } from './awsService'

const [checkedOut, setCheckedOut] = useState('')

const handleGetTags = async (e) => {
    const tags = await getTags('aws-s3-logo.png')

    if (tags.statusCode === 404) {
        console.log('Tags do not exist')
        return
    }

    const isCheckedOut = tags.TagSet.find(o => o.Key === 'checkedOut').Value
    setCheckedOut(isCheckedOut)
}

<br /><br />

<label style={{ marginRight: '10px' }}><strong>4. Get Tags</strong></label>
<input type='submit' value='Is File Checked Out:' onClick={handleGetTags} />
<div>Checked Out: {checkedOut}</div>

Final code

The final code for this exercise can be found on GitHub.
