
AWS S3 with React and Node


Milan Poudel

May 24, 2024

In this part, we will see how we can use the previously created S3 bucket in our Node.js and React app. If you haven't read the previous part, AWS S3: Creating a policy and IAM user for S3, it covers setting up the bucket policy and the IAM user that our web app will use to access S3.


Creating a simple Upload form:


Let's first create a simple upload UI. Assuming the basic setup of a React project is done, we can create the following component for creating a new post.


//pages/create-post.tsx
import React from 'react'
const UploadPost = () => {
  const [title, setTitle] = React.useState('')
  const [file, setFile] = React.useState<File | null>(null)

  const handleSubmit = (e: React.FormEvent<HTMLFormElement>) => {
    e.preventDefault()
    console.log('Form Submitted')
  }
  return (
    <div className="container">
      <div>
        <form className="form" onSubmit={handleSubmit}>
          <h1>New Post Creator</h1>
          <input type="text" placeholder="Title" required className="form-field" 
          value={title} onChange={(e) => setTitle(e.target.value)} />
          <input type="file" placeholder="Upload Image" accept="image/*" required className="form-field"onChange={(e) => setFile(e.target.files ? e.target.files[0] : null)} />
          
           <button type="submit" className="form-button">
            Submit
          </button>
        </form>
      </div>
    </div>
  )
}
export default UploadPost


Styling our upload component:


The following are the CSS styles of our image upload form.


.container {
  height: 100vh;
  display: flex;
  justify-content: center;
  align-items: center;
  background: #f1f1f1;
}
.form {
  margin: 0 auto;
  width: 470px;
  padding: 30px;
  background: #fff;
  border-radius: 5px;
  box-shadow: 0 0 10px rgba(0, 0, 0, 0.1);
  text-align: center;
}
.form h1 {
  margin-bottom: 20px;
  color: #333;
  font-size: 24px;
  font-weight: 700;
}

.form-field {
  margin-bottom: 20px;
  border: 1px solid #ccc;
  border-radius: 5px;
  width: 100%;
  padding: 10px;
}
.form-button {
  background: #333;
  color: #fff;
  border: none;
  border-radius: 5px;
  padding: 10px 20px;
  cursor: pointer;
}


This is what our image uploader will look like:





Installing S3 packages and multer:


Now let's focus on our backend. On the backend, we need to install the multer package to handle the file we receive from the frontend. Since the request body will contain a binary image file, we will send form data from the frontend, and multer will parse it and give us the file buffer. We need to install these three libraries for the upload process to work.


npm install @aws-sdk/client-s3 
npm install @aws-sdk/s3-request-presigner 
npm install multer


We might also need to install the type definition packages for these libraries in case TypeScript throws an error. In my case, I also had to install the following package: @types/multer
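
It is typically installed as a dev dependency:


npm install --save-dev @types/multer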


Util and config file for multer and s3:


We will create a utils folder where we will define the config for our S3 client and multer, and import these configs into the files that need them. Remember the bucket name, access key, and secret access key we generated earlier; we need to define all of them in our .env file and use them here, as shown below.
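
For reference, the .env file might look like this (the values here are placeholders; use the bucket name, region, and keys from your own setup):


BUCKET_NAME=my-example-bucket
BUCKET_REGION=us-east-1
AWS_ACCESS_KEY=AKIAXXXXXXXXXXXXXXXX
AWS_SECRET=your-secret-access-key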


// utils/imageUpload.ts
import multer from 'multer'
import { S3Client } from '@aws-sdk/client-s3'
import dotenv from 'dotenv'
dotenv.config()

const storage = multer.memoryStorage()

// configuring multer with memory storage and a 10MB file size limit
export const uploadMulter = multer({ storage, limits: { fileSize: 10 * 1024 * 1024 } })

const bucketRegion = process.env.BUCKET_REGION!
const awsAccessKey = process.env.AWS_ACCESS_KEY!
const awsSecretKey = process.env.AWS_SECRET!

// initializing the s3 client with our credentials and region, and exporting it
export const s3 = new S3Client({
  credentials: {
    accessKeyId: awsAccessKey,
    secretAccessKey: awsSecretKey
  },
  region: bucketRegion
})



In the routes folder, we are going to add a new endpoint for adding a new post. Since we are submitting form data with a binary file, we also need to use the multer function as middleware in our route.


import express from 'express'
import { addPost } from '@controllers/post'
import { uploadMulter } from '@utils/imageUpload'

const router = express.Router()

// the field name passed to multer ('postImage') must match the field name sent from the frontend
router.post('/', uploadMulter.single('postImage'), addPost)

export default router
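
Note that the imports above use path aliases like @controllers and @utils. If your project does not already map them, one way is a "paths" entry in tsconfig.json (the directory names below are assumptions about your folder structure), usually paired with a runtime resolver such as tsconfig-paths or module-alias:


{
  "compilerOptions": {
    "baseUrl": "./src",
    "paths": {
      "@controllers/*": ["controllers/*"],
      "@routes/*": ["routes/*"],
      "@utils/*": ["utils/*"]
    }
  }
}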



Creating an "AddPost" handler:


In the controller, our main function will handle the async logic of uploading the image to S3 and returning the overall post data.


import { GetObjectCommand, PutObjectCommand } from '@aws-sdk/client-s3'
import { getSignedUrl } from '@aws-sdk/s3-request-presigner'
import { s3 } from '@utils/imageUpload'
import { Request, Response } from 'express'

const bucketName = process.env.BUCKET_NAME!

export const addPost = async (req: Request, res: Response) => {
  try {
    const { title } = req.body
    const postImage = req.file
    if (!title || !postImage) {
      return res.status(400).json({ message: 'Please fill all the fields' })
    }
    const uploadParams = {
      Bucket: bucketName,
      Key: postImage.originalname + '-' + Date.now(),
      Body: postImage.buffer,
      ContentType: postImage.mimetype
    }
    await s3.send(new PutObjectCommand(uploadParams))

    const getObjectParams = {
      Bucket: bucketName,
      Key: uploadParams.Key
    }
    const command = new GetObjectCommand(getObjectParams)
    const url = await getSignedUrl(s3, command, { expiresIn: 518400 })
    return res.status(201).json({
      message: 'Post added successfully',
      data: {
        title,
        postImage: url
      }
    })
  } catch (e) {
    console.log('Error while adding post', e)
    return res.status(500).json({ message: 'Something went wrong while performing the action. Please try again.' })
  }
}


Here in this "addPost" controller, we will check for the incoming image and title. If they are present, then we will upload the image to s3 with the uploadParams following config:


  • the bucket name where the image will be uploaded,
  • the key that gives a unique identity to the image,
  • the body, which contains the raw image buffer,
  • and the content type (mime type) of the image.


Here, the function "s3.send()" will post the image into our s3 bucket. We will be immediately retrieving that image as a pre-signed URL which allows us to access the file without making it public. Here, the time "51800" stands for the time after which it will expire which is 6 days.


Note: Since we are not using any database and this is just an example project, we retrieve the image URL immediately after posting it.


In the main server file "server.ts", we will import the post route:


// server.ts
import postRoutes from '@routes/post'

app.use('/api/posts', postRoutes)
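
For completeness, a minimal server.ts around that route might look something like this (the port and middleware choices here are assumptions, not part of the original setup):


// server.ts
import express from 'express'
import dotenv from 'dotenv'
import postRoutes from '@routes/post'

dotenv.config()

const app = express()

// parse JSON bodies for non-multipart routes
app.use(express.json())

// mount the post routes
app.use('/api/posts', postRoutes)

const PORT = Number(process.env.PORT) || 5000
app.listen(PORT, () => console.log(`Server running on port ${PORT}`))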


Modifying "handleSubmit" and adding axios request:


Now our backend setup is done. Let's edit our frontend a bit to handle the post upload. We will modify our "handleSubmit" function.


  const handleSubmit = (e: React.FormEvent<HTMLFormElement>) => {
    e.preventDefault()
    axios
      .post(`${baseUrl}/posts`, { title, postImage: file }, 
      { headers: { 'Content-Type': 'multipart/form-data' } })
      .then((res) => {
        alert(res?.data?.message)
        console.log(res?.data?.data) // this contains our post data with title and image
      })
      .catch((e) => {
        alert(e?.response?.data?.message || 'Error from Server')
        console.log(e)
      })
  }
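
One thing to be aware of: passing a plain object with the multipart/form-data header relies on axios automatically serializing it to FormData, which it only does from version 0.27 onwards. If you are on an older version, or prefer being explicit, you can build the FormData yourself; the field name must match the one given to multer ('postImage'). The baseUrl shown here is an assumption about where the node server runs:


  // assumed base URL of the node server, e.g. http://localhost:5000/api
  const baseUrl = 'http://localhost:5000/api'

  const handleSubmit = (e: React.FormEvent<HTMLFormElement>) => {
    e.preventDefault()
    if (!file) return

    // build the multipart body explicitly; field names must match the backend
    const formData = new FormData()
    formData.append('title', title)
    formData.append('postImage', file)

    // axios sets the multipart Content-Type (with boundary) automatically for FormData
    axios
      .post(`${baseUrl}/posts`, formData)
      .then((res) => alert(res?.data?.message))
      .catch((e) => alert(e?.response?.data?.message || 'Error from Server'))
  }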


On the frontend, we set up axios with the base URL where our node server runs. We set the "Content-Type" header to multipart/form-data since we are sending a binary file. When we click submit, we will get the following response:


{
    "message": "Post added successfuly",
    "data": {
        "title": "My title",
        "postImage": "...." // s3 presigned url of our image
    }
}


Hence, we have successfully posted our image to the S3 bucket and retrieved it using a pre-signed URL from our Node.js server.


In conclusion:


Setting up S3 in React and Node is relatively easy. There are some gotchas in the initial setup of the bucket policies and the "IAM" user, which is the recommended way to grant access, since we only want authorized users or web apps to reach our AWS services. S3 has become an industry-standard storage service, so learning it is both valuable and straightforward.

