# @mf-app/utility-module v1.3.1

React Micro Frontend Utility Template
## Getting Started
- Run the setup script to initialize the project and install dependencies:

```sh
./setup.sh
```
- Run `yarn start --port ${YOUR_PORT}` to run the module locally.
- Add your new micro frontend to the root config module inside the root `index.ejs`, or use Import Map Deployer:
```html
<script type="systemjs-importmap">
  {
    "imports": {
      "react": "https://cdn.jsdelivr.net/npm/react@16.13.0/umd/react.production.min.js",
      "react-dom": "https://cdn.jsdelivr.net/npm/react-dom@16.13.0/umd/react-dom.production.min.js",
      "single-spa": "https://cdn.jsdelivr.net/npm/single-spa@5.3.0/lib/system/single-spa.min.js",
      "@${PROJECT_NAME}/root-config": "//localhost:9000/${PROJECT_NAME}-root-config.js",
      "@${PROJECT_NAME}/{UTILITY_MODULE_NAME}": "//localhost:${YOUR_PORT}/${PROJECT_NAME}-{UTILITY_MODULE_NAME}.js"
    }
  }
</script>
```
- Register your utility module as an external in `webpack.config.js` for each micro frontend:
```js
const { merge } = require('webpack-merge');
const singleSpaDefaults = require('webpack-config-single-spa-react-ts');

module.exports = (webpackConfigEnv, argv) => {
  const defaultConfig = singleSpaDefaults({
    orgName: '${PROJECT_NAME}',
    projectName: '${MICRO_FRONTEND_NAME}',
    webpackConfigEnv,
    argv,
  });

  return merge(defaultConfig, {
    // change the placeholders; the external name must match the import specifier
    externals: ['@${PROJECT_NAME}/{UTILITY_MODULE_NAME}'],
  });
};
```
- Import your utilities in the micro frontend and use them:

```js
import { utilityName } from '@${PROJECT_NAME}/{UTILITY_MODULE_NAME}';
```
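A utility shared this way is just a named export from the utility module's entry file. A minimal sketch, assuming a hypothetical `formatCurrency` helper (the name is illustrative, not part of the template):

```javascript
// Hypothetical utility in the utility module's entry file.
// Any named export becomes available to consuming micro frontends
// through the import map entry for @${PROJECT_NAME}/{UTILITY_MODULE_NAME}.
export function formatCurrency(amount, currency = 'USD') {
  return new Intl.NumberFormat('en-US', { style: 'currency', currency }).format(amount);
}

// In a consuming micro frontend:
// import { formatCurrency } from '@${PROJECT_NAME}/{UTILITY_MODULE_NAME}';
// formatCurrency(9.99); // "$9.99"
```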
- If you are using TypeScript in your micro frontend, it's recommended to choose NPM when running `./setup.sh`, and then run:

```sh
yarn add @${PROJECT_NAME}/{UTILITY_MODULE_NAME}
```

This way, TypeScript will infer your types and code, and Jest won't fail when it cannot resolve a valid import of the utility during testing.
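If you'd rather not install the package from NPM, one possible workaround (an assumption, not part of the template) is an ambient module declaration so the bare import still type-checks:

```typescript
// src/declarations.d.ts — hypothetical ambient declaration.
// Replace the placeholder module name and the signature with your
// utility's real API so TypeScript and Jest can resolve the import.
declare module '@${PROJECT_NAME}/{UTILITY_MODULE_NAME}' {
  export function utilityName(...args: unknown[]): unknown;
}
```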
- Run `yarn start` to run your root config module.
- Set the `devtools` local storage key at the browser console, whether your root module is running locally or against the dev or prod environment:

```js
localStorage.setItem('devtools', true);
```

- This enables the import-map-overrides extension, which lets you point the import map to a micro frontend that is running locally. See the import-map-overrides docs for details.
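With the devtools key set, you can also add an override directly from the browser console instead of through the UI. A sketch using import-map-overrides' `addOverride` API (browser-only; the placeholders are this template's, the port is whatever you started the module on):

```javascript
// Point the utility module import at a locally running build.
window.importMapOverrides.addOverride(
  '@${PROJECT_NAME}/{UTILITY_MODULE_NAME}',
  '//localhost:${YOUR_PORT}/${PROJECT_NAME}-{UTILITY_MODULE_NAME}.js'
);

// Remove it again when you are done:
// window.importMapOverrides.removeOverride('@${PROJECT_NAME}/{UTILITY_MODULE_NAME}');
```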
## Secrets
Set up secrets for the S3 bucket names and roles used to deploy to AWS in the GitHub Actions files. The secrets needed are:

- `ACTIONS_DEPLOY_ACCESS_TOKEN`: GitHub token used by Semantic Release
- `FRONTEND_DEPLOYMENT_ROLE`: IAM Role ARN
- `BUCKET_NAME`: S3 bucket name
- `MICRO_FRONTEND_NAME`: Micro frontend name. This is used to create the folder that will hold your micro frontend's deployed JS files
- `NPM_TOKEN`: Secret at your repository with an NPM Automation Access Token so that your utility can be published to NPM
- `IMD_USERNAME`: Username to authenticate with, in case you are using Import Map Deployer
- `IMD_PASSWORD`: Password to authenticate with, in case you are using Import Map Deployer
- `IMD_HOST`: Import Map Deployer domain name (without `https`)
- `IMD_ENVIRONMENT`: Import Map Deployer environment that you want to update (prod, dev, staging)
- `CLOUDFRONT_HOST`: CloudFront domain name (without `https`). This can also be a Route 53 or S3 bucket domain, in case you are not using CloudFront to host your import map JSON file.

These secrets should contain production values. You can then override them using Environment Secrets.
## Environments
- Create `Development` and `Production` environments and set each one to deploy from the `dev` and `master` branches respectively (Selected Branches rule).
- Each environment should have its own S3 bucket, IAM role for deployment, and CloudFront distribution.
- Set environment secrets at `Development` so that the development `FRONTEND_DEPLOYMENT_ROLE` points to a role that can interact with the development S3 `BUCKET_NAME`.
- Override `IMD_ENVIRONMENT` in the Development environment secrets so that it points to dev, test, or whatever name you gave this environment in your Import Map Deployer server.
- Change the `environment-url` input passed down to the deployment workflow so that each environment points to the corresponding CloudFront or Route 53 URL.
- Set `run-import-map-deployer` to `true` if you have already stored the required import map secrets.
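As a sketch of how these inputs line up per environment, a caller workflow might look like this (the job names, workflow path, and URL are assumptions for illustration, not the template's exact files):

```yaml
# Hypothetical caller of the reusable deployment workflow.
jobs:
  deploy-dev:
    if: github.ref == 'refs/heads/dev'
    uses: ./.github/workflows/deployment.yml
    with:
      environment: Development
      environment-url: https://dev.example.com
      run-import-map-deployer: true
    secrets: inherit
```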
## Important notes
- Maintain a consistent project name (all micro frontends and the root project should have the same project name).
- It's recommended to use the root config module template that accompanies this template, to stay consistent with the project naming convention.
## Import Map Deployer
- It's highly recommended to use Import Map Deployer so that the root repo gets the micro frontend imports from a dynamic import map JSON file. If you don't want to use it, remove the following lines from `.github/workflows/build_and_deploy.yml`:
```yaml
- name: Update import map
  run: curl -u ${USERNAME}:${PASSWORD} -d '{ "service":"@{YOUR_ORGANIZATION_NAME}/'"${MICRO_FRONTEND_NAME}"'","url":"https://'"${CLOUDFRONT_HOST}"'/'"${MICRO_FRONTEND_NAME}"'/'"${IDENTIFIER}"'/{YOUR_ORGANIZATION_NAME}-'"${MICRO_FRONTEND_NAME}"'.js" }' -X PATCH https://${IMD_HOST}/services/\?env=prod -H "Accept:application/json" -H "Content-Type:application/json"
  env:
    USERNAME: ${{ secrets.IMD_USERNAME }}
    PASSWORD: ${{ secrets.IMD_PASSWORD }}
    MICRO_FRONTEND_NAME: ${{ secrets.MICRO_FRONTEND_NAME }}
    CLOUDFRONT_HOST: ${{ secrets.CLOUDFRONT_HOST }}
    IMD_HOST: ${{ secrets.IMD_HOST }}
    IDENTIFIER: ${{ github.sha }}
```
This step sends a PATCH request to your Import Map Deployer server at the `${IMD_HOST}` domain name, on the `/services` endpoint:

- It sends a JSON body with the service it wants to update and a `url` key/value pair containing the new utility module URL.
- It also sends the Import Map Deployer username and password in order to authenticate with the server.
If you are not using Import Map Deployer, add your compiled JS utility code to the root module import maps:

```html
<% if (isLocal) { %>
<script type="systemjs-importmap">
  {
    "imports": {
      "@${PROJECT_NAME}/root-config": "//localhost:9000/${PROJECT_NAME}-root-config.js",
      "@${PROJECT_NAME}/{UTILITY_MODULE_NAME}": "//localhost:${YOUR_PORT}/${PROJECT_NAME}-{UTILITY_MODULE_NAME}.js"
    }
  }
</script>
<% } else { %>
<script type="systemjs-importmap">
  {
    "imports": {
      "@${PROJECT_NAME}/root-config": "https://{S3_BUCKET_NAME}.s3.amazonaws.com/${PROJECT_NAME}-root-config.js",
      "@${PROJECT_NAME}/{UTILITY_MODULE_NAME}": "https://{S3_BUCKET_NAME}.s3.amazonaws.com/${PROJECT_NAME}-{UTILITY_MODULE_NAME}.js"
    }
  }
</script>
<% } %>
```
## Semantic Release
- Set the `ACTIONS_DEPLOY_ACCESS_TOKEN` secret at your repository with a GitHub Personal Access Token so that Semantic Release can work properly.
- This token should have full control of private repositories.
## Deploying Package to NPM
- Set the `NPM_TOKEN` secret at your repository with an NPM Automation Access Token so that your utility can be published to NPM.
- It's highly recommended to publish your package to NPM so that you don't get TypeScript errors about your utility module not being found when it is imported in your micro frontends.
- The `${PROJECT_NAME}` prompted when running `./setup.sh` should be the same as your NPM organization or username. This way, you will avoid errors in your GitHub Actions pipeline at the `npm publish --access=public` step.
- You can remove the `--access=public` option from `npm publish` if you are able to publish private packages to NPM.
- You can remove the `.github` folder if you don't want to use the CI/CD GitHub Actions for semantic release, publishing to NPM, automated testing, and deployment.
## Deployment in AWS
- Build the project with `yarn build` and deploy the files to a CDN (CloudFront + S3) or another host that can serve those static files.
- According to `.github/workflows/main.yml`, the action will assume a role through GitHub OIDC and AWS STS. This role has permissions to put new objects in your S3 bucket.
- This action step will send the build files generated in the `dist` folder to `s3://${BUCKET_NAME}/${MICRO_FRONTEND_NAME}/${IDENTIFIER}`.
- That way, it stores your utility's compiled code in the same `${MICRO_FRONTEND_NAME}` folder and stores each new version under the GitHub commit SHA `${IDENTIFIER}`.
- The Import Map Deployer step will then update the `import-map.json` file in your S3 bucket with the new compiled file route.

All the instructions to deploy the whole infrastructure to AWS are in the Micro Frontend Root Documentation.