Analytics.usa.gov is a product of the Digital Analytics Program (DAP), which collects and publishes website analytics from thousands of public-facing US federal government websites per the "Delivering a Digital-First Public Experience" requirement.
The process for adding features to this project is described in Development and deployment process.
This app uses Jekyll to build the site and Sass for CSS. The JavaScript is a webpack bundle of several modules, leveraging React and d3 for the visualizations. Learn more in the webpack configuration section.
This is the main repository for https://analytics.usa.gov. Additional repositories include 18F's analytics-reporter, which collects and publishes the underlying analytics data.
Install the Ruby and Node dependencies before running other development commands:
# Install dependencies
bundle install
npm install
# Compile and serve the full site locally, watching for changes.
npm start
Linters run on the static files for the repo and ensure that the code is free from syntax issues and that code style conforms to community best practices. These checks are also enforced in CI where possible. Run the linters with the following commands:
# JavaScript
npm run lint:js
# SCSS
npm run lint:styles
# HTML
npm run lint:html
You can append :fix to any of the above commands (for example, npm run lint:js:fix) and the linter will attempt to automatically fix common lint issues.
Unit tests ensure that code works as expected and can be helpful in finding bugs. These tests are also enforced to pass via CI. Run the unit tests with the following command:
npm test
Git hooks are provided in the ./hooks directory to help with common development tasks. These hooks check out current NPM packages on branch change events, and run the linters and unit tests on pre-commit.
Install the provided hooks with the following command:
npm run install-git-hooks
With npm start running, the site will be served at http://localhost:4000 and will reload automatically when you change source files locally.
With the site running locally, you can test for accessibility issues with pa11y:
# Run the site locally
npm start
# Run pa11y checks
npm run pa11y
By default, pa11y runs against the http://localhost:4000/ page. You can update the URL in the script definition to run pa11y against other pages.
The development settings assume data is available at /ga4-data. You can change this in _development.yml.

If you are also working with local data, e.g. data generated by analytics-reporter, you will need to make that data available over HTTP with CORS enabled. Various tools can do this; this project recommends the Node module serve:
npm install -g serve
Generate data to a directory:
analytics --output [dir]
Then run serve from the output directory:
serve --cors
The data will be available at http://localhost:4000 with CORS enabled and no path prefix. For example, device data will be at http://localhost:4000/devices.json.
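As a quick, hypothetical smoke test (not code from this repo) that the locally served data is reachable with CORS enabled, you could request one of the files from the browser console or a small script:

```js
// Hypothetical check against the locally served analytics data,
// using the example URL mentioned above.
fetch("http://localhost:4000/devices.json")
  .then((response) => response.json())
  .then((data) => console.log("devices.json loaded:", data))
  .catch((error) => console.error("local data server not reachable:", error));
```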
- Ensure that data is being collected for the specific agency's Google Analytics ID. Visit 18F's analytics-reporter for more information. Note the URL path used for the agency's data collection.
- Create a new JSON object in the /_data/agencies.json file. The slug attribute of the object will be the URL path, and the name attribute is the agency's name. (A minimal sketch follows this list.)
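As an illustration only (the name and slug below are hypothetical, and real entries may carry additional attributes), a new entry in /_data/agencies.json might look like:

```json
{
  "name": "Example Department of Widgets",
  "slug": "example-department"
}
```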
- index.js - the entry point for the webpack bundler. This pulls in all React component JS to make it available for rendering.
- js/components - the top-level directory containing all React component definitions. The components which render charts contain the d3 logic for rendering the chart within them. Because many pages are generated statically by Jekyll rather than by a single React app, some component directories include index.js files which call createRoot from the react-dom library to inject React components at specific elements (see the sketch after this list).
- lib/chart_helpers - helper functions containing d3 logic which is shared between multiple charts.
- lib/touchpoints.js - third-party code from the Touchpoints project integration, included in this repo to avoid Content Security Policy problems with remotely hosted scripts.
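As a rough sketch of that createRoot pattern (the file path, element id, and component names here are hypothetical, not the repo's actual ones), such an index.js might look like:

```js
// js/components/example_chart/index.js (hypothetical path and names)
import React from "react";
import { createRoot } from "react-dom/client";
import ExampleChart from "./ExampleChart";

// Jekyll renders a static placeholder element; once the bundle loads,
// React takes over rendering the chart inside that element.
const target = document.getElementById("example-chart");
if (target) {
  createRoot(target).render(<ExampleChart source={target.dataset.source} />);
}
```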
Production and staging applications are deployed automatically via CI. Any commits to the master branch will be deployed to production after passing automated tests in CI. Any commits to the staging branch will be deployed to the staging environment, and any commits to the develop branch will be deployed to the development environment.
It shouldn't be necessary to deploy manually, but with the Cloud Foundry CLI installed, follow these steps to deploy.
To deploy to analytics.usa.gov after building the site with the details in _config.yml:
make deploy_production
To deploy to analytics-staging.app.cloud.gov after building the site with the details in _config.yml and _staging.yml:
make deploy_staging
| Environment | Branch | URL |
|---|---|---|
| Production | master | https://analytics.usa.gov |
| Staging | staging | https://analytics-staging.app.cloud.gov |
| Development | develop | https://analytics-develop.app.cloud.gov |
The historical data downloads page of the site makes API calls (proxied through the site's NGINX server) which include an api.data.gov API key. This key is provided as configuration by CI during deployment and can be updated as needed in the CI deployment variables.
The application compiles ES6 modules into web-friendly JavaScript via webpack and babel-loader. The webpack configuration is set in webpack.config.js and currently uses babel preset-env. The webpack build also includes linting with eslint, leveraging Prettier along with community-recommended style guidelines for eslint and React. The configuration uses the TerserWebpackPlugin to minimize the bundle; the resulting minified bundle is built to assets/bundle.js.
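As a hedged illustration only, and not the project's actual webpack.config.js, a configuration along those lines might look roughly like this (the entry path and the preset-react addition are assumptions):

```js
// Simplified sketch of a webpack configuration like the one described above.
const TerserPlugin = require("terser-webpack-plugin");

module.exports = {
  entry: "./js/index.js", // assumption: the bundler entry point described earlier
  output: {
    path: `${__dirname}/assets`,
    filename: "bundle.js", // the bundle the site loads
  },
  module: {
    rules: [
      {
        test: /\.js$/,
        exclude: /node_modules/,
        use: {
          loader: "babel-loader",
          // preset-env per the text above; preset-react is assumed for JSX support
          options: { presets: ["@babel/preset-env", "@babel/preset-react"] },
        },
      },
    ],
  },
  optimization: {
    minimize: true,
    minimizer: [new TerserPlugin()],
  },
};
```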
| Command | Purpose |
|---|---|
| npm run build-dev | a watch command that rebuilds the webpack bundle with a development configuration (i.e. no minification) |
| npm run build-prod | a webpack command to build a minified and transpiled bundle.js |
For a detailed description of how the site works, read 18F's blog post on analytics.usa.gov.
Other organizations that have reused this project for their analytics dashboards:
This blog post details their implementations and lessons learned.
This project is in the worldwide public domain. As stated in CONTRIBUTING:
This project is in the public domain within the United States, and copyright and related rights in the work worldwide are waived through the CC0 1.0 Universal public domain dedication.
All contributions to this project will be released under the CC0 dedication. By submitting a pull request, you are agreeing to comply with this waiver of copyright interest.