Saturday 22 February 2020

File Upload into AWS S3 using Salesforce LWC leveraging AWS SDK

Recently I was building a demo where an AWS S3 bucket was to be used as a document repository. The actual requirement is much bigger, but I am going to demonstrate a core piece of it: file upload to an AWS S3 bucket from a Salesforce Lightning Web Component (LWC) using the AWS SDK for JavaScript.

You must be wondering: what is the AWS SDK?
The AWS SDK for JavaScript enables you to directly access AWS services from JavaScript code running in the browser. You can authenticate users through Facebook, Google, or Login with Amazon using web identity federation, store application data in Amazon DynamoDB, and save user files to Amazon S3.
  1. Direct calls to AWS services mean no server-side code (Apex) is needed to consume AWS APIs.
  2. Using nothing but common web standards - HTML, CSS, and JavaScript - you can build full-featured dynamic browser applications.
  3. No Signature Version 4 signing process is needed, as the SDK does it for you internally.
For more information, see the AWS SDK for JavaScript documentation.

Prerequisites: You should have
  1. AWS Account
  2. An S3 bucket configured
  3. AWS Access Key Id
  4. AWS Secret Access Key
For more information, see the AWS user guide on access keys.

It's demo time:

To keep this demo simple, I have created a file upload component that uploads the selected file to the AWS S3 bucket on a button click.

All AWS-related configuration is kept in a custom metadata type called AWS_Setting__mdt.
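For reference, the configuration object the component assembles from that metadata type has the shape below. The field names come from the component code; the values here are dummies, not real credentials:

```javascript
// Shape of the configuration assembled from AWS_Setting__mdt
// (field names match the component; values below are placeholders).
const awsS3MetadataConf = {
  s3bucketName: "my-demo-bucket",         // from S3_Bucket_Name__c
  awsAccessKeyId: "AKIAXXXXXXXXXXXXXXXX", // from AWS_Access_Key_Id__c
  awsSecretAccessKey: "****************", // from AWS_Secret_Access_Key__c
  s3RegionName: "us-east-1"               // from S3_Region_Name__c
};

console.log(Object.keys(awsS3MetadataConf).join(", "));
```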

fileupload_aws_s3bucket.html renders a screen like the one below:



<!-- fileupload_aws_s3bucket.html -->
<template>
    <lightning-card variant="Narrow" title="AWS S3 File Uploader" style="width:30rem" icon-name="action:upload">
        <div class="slds-m-top_medium slds-m-bottom_x-large">
            <!-- Single file -->
            <div class="slds-p-around_medium lgc-bg">
                <lightning-input type="file" onchange={handleSelectedFiles}></lightning-input>
                {fileName}
            </div>
            <div class="slds-p-around_medium lgc-bg">
                <lightning-button class="slds-m-top--medium" label="Upload to AWS S3 bucket" onclick={uploadToAWS}
                    variant="brand">
                </lightning-button>
            </div>
        </div>
        <template if:true={showSpinner}>
            <lightning-spinner size="medium">
            </lightning-spinner>
        </template>
    </lightning-card>
</template>


fileupload_aws_s3bucket.js: the code is fairly self-explanatory, but let me walk through it.

First, we need to add the AWS SDK JS file as a Salesforce static resource. Here is the link to the AWS SDK JS file.
To use a JavaScript library from a third-party site, add it to a static resource, then add the static resource to the component. Once the library is loaded from the static resource, we can use it as normal.
Then use import { loadScript } from lightning/platformResourceLoader to load the resource in the LWC component's renderedCallback hook.
By default, we can't make WebSocket connections or calls to third-party APIs from JavaScript code. To do so, add the remote site as a CSP Trusted Site.
The Lightning Component framework uses Content Security Policy (CSP), a W3C standard, to control the sources of content that can be loaded on a page. The default CSP policy doesn't allow API calls from JavaScript code. We change the policy, and the content of the CSP header, by adding CSP Trusted Sites.
In this case I have added *.amazonaws.com in CSP. Special thanks to Mohith (@msrivastav13), who helped point out which domain/URL should be added under CSP.
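If you prefer deploying that setting with your source rather than clicking through Setup, a CSP Trusted Site can also be expressed as metadata. A sketch based on the standard CspTrustedSite metadata type (the file name and the LEX context value are my assumptions, not from this demo):

```
<!-- cspTrustedSites/AWS_S3.cspTrustedSite-meta.xml -->
<?xml version="1.0" encoding="UTF-8"?>
<CspTrustedSite xmlns="http://soap.sforce.com/2006/04/metadata">
    <endpointUrl>https://*.amazonaws.com</endpointUrl>
    <isActive>true</isActive>
    <context>LEX</context>
</CspTrustedSite>
```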


Once the SDK is loaded, I fetch the AWS-related configuration from AWS_Setting__mdt by assigning the metadata record Id to awsSettngRecordId (it is both dynamic and reactive), which in turn invokes the @wire service to provision the data.

From the @wire handler, the initializeAwsSdk method is called to initialize the AWS SDK with the configuration data received.

With the SDK properly initialized, we are ready to upload documents. On the button's click event, the uploadToAWS method runs; it calls the SDK's putObject method to upload the document to the S3 bucket. Refer to the AWS.S3 class documentation for the full list of methods supported by the SDK.
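One small detail worth calling out from uploadToAWS: the S3 object key is derived from the file name by collapsing whitespace into underscores and lowercasing the result. Extracted as a tiny standalone helper (the helper name is mine, not from the component):

```javascript
// Derive an S3 object key from a file name:
// runs of whitespace become "_" and the result is lowercased.
const toObjectKey = fileName => fileName.replace(/\s+/g, "_").toLowerCase();

console.log(toObjectKey("My Report 2020.pdf")); // my_report_2020.pdf
```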

Please note: with this approach the Access Key Id and Secret Access Key are open for anyone inspecting the client to see, so treat this as a demo-only pattern.
/* fileupload_aws_s3bucket.js */
/* eslint-disable no-console */
import { LightningElement, track, wire } from "lwc";
import { getRecord } from "lightning/uiRecordApi";
import { loadScript } from "lightning/platformResourceLoader";
import AWS_SDK from "@salesforce/resourceUrl/AWSSDK";

export default class Fileupload_aws_s3bucket extends LightningElement {
  /*========= Start - variable declaration =========*/
  s3; //store AWS S3 object
  isAwsSdkInitialized = false; //flag to check if AWS SDK initialized
  @track awsSettngRecordId; //store record id of custom metadata type where AWS configurations are stored
  selectedFilesToUpload; //store selected file
  @track showSpinner = false; //used for when to show spinner
  @track fileName; //to display the selected file name

  /*========= End - variable declaration =========*/

  //Called after every render of the component. This lifecycle hook is specific to Lightning Web Components,
  //it isn’t from the HTML custom elements specification.
  renderedCallback() {
    if (this.isAwsSdkInitialized) {
      return;
    }
    Promise.all([loadScript(this, AWS_SDK)])
      .then(() => {
        //For demo, hard coded the Record Id. It can dynamically be passed the record id based upon use cases
        this.awsSettngRecordId = "m012v000000FMQJ";
      })
      .catch(error => {
        console.error("error -> " + error);
      });
  }

  //Using wire service getting AWS configuration from Custom Metadata type based upon record id passed
  @wire(getRecord, {
    recordId: "$awsSettngRecordId",
    fields: [
      "AWS_Setting__mdt.S3_Bucket_Name__c",
      "AWS_Setting__mdt.AWS_Access_Key_Id__c",
      "AWS_Setting__mdt.AWS_Secret_Access_Key__c",
      "AWS_Setting__mdt.S3_Region_Name__c"
    ]
  })
  awsConfigData({ error, data }) {
    if (data) {
      let awsS3MetadataConf = {};
      let currentData = data.fields;
      //console.log("AWS Conf ====> " + JSON.stringify(currentData));
      awsS3MetadataConf = {
        s3bucketName: currentData.S3_Bucket_Name__c.value,
        awsAccessKeyId: currentData.AWS_Access_Key_Id__c.value,
        awsSecretAccessKey: currentData.AWS_Secret_Access_Key__c.value,
        s3RegionName: currentData.S3_Region_Name__c.value
      };
      this.initializeAwsSdk(awsS3MetadataConf); //Initializing AWS SDK based upon configuration data
    } else if (error) {
      console.error("error ====> " + JSON.stringify(error));
    }
  }

  //Initializing AWS SDK
  initializeAwsSdk(confData) {
    const AWS = window.AWS;
    AWS.config.update({
      accessKeyId: confData.awsAccessKeyId, //Assigning access key id
      secretAccessKey: confData.awsSecretAccessKey //Assigning secret access key
    });

    AWS.config.region = confData.s3RegionName; //Assigning region of S3 bucket

    this.s3 = new AWS.S3({
      apiVersion: "2006-03-01",
      params: {
        Bucket: confData.s3bucketName //Assigning S3 bucket name
      }
    });
    this.isAwsSdkInitialized = true;
  }

  //get the file name from user's selection
  handleSelectedFiles(event) {
    if (event.target.files.length > 0) {
      this.selectedFilesToUpload = event.target.files[0];
      this.fileName = event.target.files[0].name;
      console.log("fileName ====> " + this.fileName);    }
  }

  //file upload to AWS S3 bucket
  uploadToAWS() {
    if (this.selectedFilesToUpload) {
      this.showSpinner = true;
      let objKey = this.selectedFilesToUpload.name
        .replace(/\s+/g, "_") //each space character is being replaced with _
        .toLowerCase();

      //starting file upload
      this.s3.putObject(
        {
          Key: objKey,
          ContentType: this.selectedFilesToUpload.type,
          Body: this.selectedFilesToUpload,
          ACL: "public-read"
        },
        err => {
          if (err) {
            this.showSpinner = false;
            console.error(err);
          } else {
            this.showSpinner = false;
            console.log("Success");
            this.listS3Objects();
          }
        }
      );
    }
  }

  //listing all stored documents from S3 bucket
  listS3Objects() {
    //console.log("AWS -> " + JSON.stringify(this.s3));
    this.s3.listObjects((err, data) => {
      if (err) {
        console.log("Error", err);
      } else {
        console.log("Success", data);
      }
    });
  }
}

56 comments:

  1. Thanks Avijit da for the code

    ReplyDelete
  2. I have used the above code and called it inside an Aura component so that I can call it from a quick action button. I am getting an error: Cannot read property 'putObject' of undefined. Can you help me out with this?

    ReplyDelete
    Replies
    1. You need to give permission to your S3 bucket.

      Delete
    2. Can you tell us more about which permissions could be missing on the S3 bucket?

      Delete
  3. Access to XMLHttpRequest at 'https://akhoslaawsbucket.s3.us-east-2.amazonaws.com/ps4_games_list.txt' from origin 'https://abhisheklightning-dev-ed.lightning.force.com' has been blocked by CORS policy: Response to preflight request doesn't pass access control check: No 'Access-Control-Allow-Origin' header is present on the requested resource.

    ReplyDelete
    Replies
    1. You need to add a CORS configuration on your S3 bucket to allow origin 'https://abhisheklightning-dev-ed.lightning.force.com' to make calls to your S3 endpoint. https://docs.aws.amazon.com/AmazonS3/latest/user-guide/add-cors-configuration.html

      For example:

      <CORSConfiguration>
        <CORSRule>
          <AllowedOrigin>https://abhisheklightning-dev-ed.lightning.force.com</AllowedOrigin>
          <AllowedMethod>PUT</AllowedMethod>
          <AllowedMethod>POST</AllowedMethod>
          <AllowedMethod>DELETE</AllowedMethod>
          <AllowedHeader>*</AllowedHeader>
        </CORSRule>
      </CORSConfiguration>
      Delete
    2. Hi Abhishek

      I have the same issue i was wondering if the solution above helped you
      Thanks

      Delete
    3. Hi Abhishek, I am getting the same issue:
      Access to XMLHttpRequest at 'https://devcom131-dev-ed.my.salesforce.com/services/data/v32.0/query/?q=SELECT%20Id,%20NamespacePrefix%20FROM%20PackageLicense%20where%20NamespacePrefix%20in%20(%27vlocity_cmt%27,%20%27vlocity_ins%27)' (redirected from 'https://devcom131-dev-ed.lightning.force.com/services/data/v32.0/query/?q=SELECT%20Id,%20NamespacePrefix%20FROM%20PackageLicense%20where%20NamespacePrefix%20in%20(%27vlocity_cmt%27,%20%27vlocity_ins%27)') from origin 'https://devcom131-dev-ed.lightning.force.com' has been blocked by CORS policy: No 'Access-Control-Allow-Origin' header is present on the requested resource.

      Did you resolve it?

      Delete
  4. The access id and secret key will be open for all to see. How can we avoid that?

    ReplyDelete
    Replies
    1. +1 here.
      The only solution I found is to do manual upload/download using presigned URLs, but if this is possible, it would be great.

      Delete
  5. Throws a 403 forbidden for me.

    ReplyDelete
  6. Hello. I am following all the steps, and I am getting an "Undefined" error when loading the script (loadScript). Any idea why that is happening?
    Some googling says it is related to Salesforce's "LockerService". Have you encountered that issue?

    ReplyDelete
    Replies
    1. Hi Avijit, I am facing the same issue. The component is on Opportunity, but whenever the component is loaded, the console shows error -> "Undefined" (line 31) of your code. So, any idea how to resolve this?

      Delete
    2. OK, I resolved the issue; it was because I was using an old AWS SDK JS file.

      Delete
  7. Hi Avijit, does this code have any file size limitations?

    ReplyDelete
    Replies
    1. Technically no limit. I have uploaded up to 300 MB as per my need.

      Delete
  8. Can I create free S3 Bucket ?

    ReplyDelete
    Replies
    1. Yes, through the AWS Free Tier plan (if you don't have a commercial plan).

      https://aws.amazon.com/free/?all-free-tier.sort-by=item.additionalFields.SortRank&all-free-tier.sort-order=asc

      Delete
  9. Thank you for sharing. How can I keep versions of a file in Salesforce?
    Can you help me with that?

    ReplyDelete
    Replies
    1. Versions are maintained on the S3 bucket side only. You may check/consult with an AWS expert.

      Delete
  10. Thanks for sharing. I followed the script, but my upload is not working; I see the spinner forever. Any idea?

    ReplyDelete
  11. Hi Avijit !!! Great post. I have a quick question: will this code work on the SF Mobile App, assuming we are using the OOTB mobile app for SF?

    ReplyDelete
  12. I have followed your tutorial but I am getting this error: "has been blocked by CORS policy: Response to preflight request doesn't pass access control check: No 'Access-Control-Allow-Origin' header is present on the requested resource."
    Can you suggest how to solve this?

    ReplyDelete
  13. You need to work with AWS admin to allow CORS.

    ReplyDelete
    Replies
    1. Thank you for your reply. But I can't figure out what the "AllowedOrigins" value for my org should be. Will you please help me?

      "AllowedOrigins": [
          "https://www.example.org"
      ]

      I have found this here: https://docs.aws.amazon.com/sdk-for-javascript/v2/developer-guide/cors.html

      Delete
    2. For example:

      <CORSConfiguration>
        <CORSRule>
          <AllowedOrigin>https://avijitgorai-dev-ed.lightning.force.com</AllowedOrigin>
          <AllowedMethod>PUT</AllowedMethod>
          <AllowedMethod>POST</AllowedMethod>
          <AllowedMethod>DELETE</AllowedMethod>
          <AllowedHeader>*</AllowedHeader>
        </CORSRule>
      </CORSConfiguration>

      Delete
    3. Thank you very much. That solved the problem. But I am facing a new problem now: I received a 403 error:
      " AccessDenied: Access Denied
      at constructor.extractError....
      at constructor.callListeners...
      at constructor.emit.....
      at constructor.emitEvent....
      at constructor.e...
      at i.runTo...
      at eval...
      at constructor.eval...
      at constructor.callListeners..."

      Delete
    4. One thing: I was facing a problem using your suggested JavaScript SDK library for AWS S3 in this tutorial. Then I used another source file, "AWS SDK for JavaScript v2.879.0".

      Delete
    5. Always take the latest version.

      Delete
    6. I am using the latest stable version now, but I am still getting AccessDenied. Is the problem on the Salesforce side or the AWS side?

      Delete
    7. Problem solved. We found the problem in the AWS configuration. Thank you for your support.

      Delete
  14. Glad that your issue got resolved.

    ReplyDelete
  15. Hi Avijit,
    Thanks for sharing. But I am also getting the same problem, "putObject undefined". Can you provide the latest AWS SDK for JavaScript v2.879.0 .js file link? I tried several ways to get it.

    ReplyDelete
    Replies
    1. https://sdk.amazonaws.com/js/aws-sdk-2.953.0.min.js


      https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/

      Delete
    2. thanks avijit for your response

      Delete
  16. Hi
    I am not able to upload the file; the spinner keeps moving and no error appears in the console. Please help me.

    ReplyDelete
    Replies
    1. Hi Avijit
      I tried several ways; please guide me to get it done. It's very critical for me.

      Delete
    2. renderedCallback() {
         if (this.isAwsSdkInitialized) {
           return;
         }
         loadScript(this, AWS_SDK + '/aws-sdk-2.1171.0.min.js')
           .then(() => loadScript(this, AWS_SDK + '/aws-sdk-2.1171.0.min.js'), (error) => {
             console.log(error);
           })
           .then(() => {
             console.log("All JS are loaded. perform any initialization function.");
           }, (error) => {
             console.log(error);
           });
       }

       @wire(getGeneralS3Config)
       getGeneralS3Config({ error, data }) {
         if (data) {
           let awsS3MetadataConf = {};
           awsS3MetadataConf = {
             s3bucketName: data[0].Bucket_Name__c,
             awsAccessKeyId: data[0].Access_Key__c,
             awsSecretAccessKey: data[0].Secret_Access_Key__c,
             s3RegionName: data[0].Region_Name__c
           };
           this.initializeAwsSdk(awsS3MetadataConf); //Initializing AWS SDK based upon configuration data
         } else if (error) {
           console.error("error ====> " + JSON.stringify(error));
         }
       }

       initializeAwsSdk(confData) {
         console.log('confData', confData);
         const AWS = window.AWS_SDK;
         console.log('AWS', AWS);
         AWS.config.update({
           accessKeyId: confData.awsAccessKeyId, //Assigning access key id
           secretAccessKey: confData.awsSecretAccessKey //Assigning secret access key
         });

         AWS.config.region = confData.s3RegionName; //Assigning region of S3 bucket
         //console.log('AWS',AWS);
         this.s3 = new AWS.S3({
           apiVersion: "2006-03-01",
           params: {
             Bucket: confData.s3bucketName //Assigning S3 bucket name
           }
         });
         this.isAwsSdkInitialized = true;
       }

       //get the file name from user's selection
       handleSelectedFiles(event) {
         if (event.target.files.length > 0) {
           this.selectedFilesToUpload = event.target.files[0];
           this.fileName = event.target.files[0].name;
           console.log("fileName ====> " + this.fileName);
         }
       }

       //file upload to AWS S3 bucket
       uploadToAWS() {
         if (this.selectedFilesToUpload) {
           this.showSpinner = true;
           console.log('this.selectedFilesToUpload.name', this.selectedFilesToUpload.name);
           console.log('this.selectedFilesToUpload.name', this.selectedFilesToUpload.type);
           let objKey = this.selectedFilesToUpload.name
             .replace(/\s+/g, "_") //each space character is being replaced with _
             .toLowerCase();
           console.log('this.selectedFilesToUpload.name', objKey);
           console.log('this.selectedFilesToUpload', this.selectedFilesToUpload);
           //starting file upload
           this.s3.putObject(
             {
               Key: objKey,
               ContentType: this.selectedFilesToUpload.type,
               Body: this.selectedFilesToUpload,
               ACL: "public-read"
             },
             err => {
               if (err) {
                 this.showSpinner = false;
                 console.log('err', err);
               } else {
                 this.showSpinner = false;
                 console.log("Success");
                 this.listS3Objects();
               }
             }
           );
         }
       }

       //listing all stored documents from S3 bucket
       listS3Objects() {
         //console.log("AWS -> " + JSON.stringify(this.s3));
         this.s3.listObjects((err, data) => {
           if (err) {
             console.log("Error", err);
           } else {
             console.log("Success", data);
           }
         });
       }
     }

      This is the code that I have used, and the error coming is - Uncaught (in promise) TypeError: LWC component's @wire target property or method threw an error during value provisioning. Original error:
      [Cannot read properties of undefined (reading 'config')]

      Delete
    3. Not sure if you have properly initialised the SDK with the correct values or S3 bucket. Check console.log('AWS', AWS) - what does it print?

      Delete
    4. Problem solved. I found the problem in loading the script. Thank you for your support.
      Now a new error is coming - AccessControlListNotSupported: The bucket does not allow ACLs.
      Can you please help me with this?

      Delete
    5. You should be able to go to the AWS S3 console and navigate to the bucket details for the bucket you are trying to write objects to. You'll see a tab called 'Permissions'. There you have the option to change the "Object Ownership" in a block with the same title.

      Once there, you can choose the option "ACLs enabled".

      After applying those changes, you should be able to write objects with ACL options.

      Or contact your AWS admin to resolve this error.

      Delete
    6. Hi Avijit,
      Thanks for the quick response. Does this code have any file size limitations? When I try to upload 500 MB, a timeout error comes in JS. Is there anything I am missing? Please let me know.

      Delete
    7. Technically I don't see a limit, but check the AWS SDK documentation for things such as connection timeout, file upload size limits, etc.

      Delete
    8. Hi Avijit,
      Thanks for the quick response. With the above code, can we upload multiple files at a time? If yes, can you share a link or code which I can use for uploading multiple files?

      Delete
    9. Hi Avijit
      One more question: is there any way to verify whether the AWS Access Key and Secret Key are valid, using Salesforce Apex or the AWS SDK JS, without inserting a dummy file into the bucket?

      Delete
    10. Hi Avijit ,
      I am waiting for your response.

      Delete
  17. It seems multiple file upload is possible, but I never explored that. You need to check the SDK documentation regarding multiple file upload and access key validation.

    ReplyDelete
    Replies
    1. Hi Avijit
      Thanks for the response. Just wanted to know: can we upload a large file without using the AWS SDK JS, i.e. using an XMLHttpRequest in JavaScript? If you know any reference, please let me know. I am facing the error - The request signature we calculated does not match the signature you provided. Check your key and signing method.------PUT
      application/pdf
      x-amz-date:Mon, 18 Jul 2022 07:28:54 GMT
      /bucketname/contactlisttestuploadsome stringstringrequesthost


      Please help me if you have any suggestions .

      Delete
  18. This comment has been removed by the author.

    ReplyDelete
  19. Hi Avijit
    We are getting this error in JS when using the above code and uploading 1.5 GB. Please help; it's very crucial for us and we need to resolve this error.

    err NetworkingError: Network Failure
    at Object.eval (testlogo__aws:89:28926)
    at XMLHttpRequest.handleEvent (aura_proddebug.js:27113:18)
    at XMLHttpRequest.wrapperFn (aura_proddebug.js:505:29)

    net::ERR_CONNECTION_RESET

    ReplyDelete
  20. Hi Avijit
    Is there any way to validate the access key, secret access key, region name, and bucket name through the AWS SDK, or any other way I can implement in Salesforce?
    Please let me know if you have a solution to this problem, or any documentation or link. Please share it with me; it's very urgent for me to overcome this problem.

    ReplyDelete