Update documentation for dataflow testing and deployment.

Change-Id: Ic5ddf9cc9d5a262e2848f1dd774ebb70fbf1e6b7
Bug: 937519
Reviewed-on: https://chromium-review.googlesource.com/c/infra/infra/+/1519002
Reviewed-by: Andrii Shyshkalov <tandrii@chromium.org>
Commit-Queue: Erik Chen <erikchen@chromium.org>
Cr-Original-Commit-Position: refs/heads/master@{#21375}
Cr-Mirrored-From: https://chromium.googlesource.com/infra/infra
Cr-Mirrored-Commit: 41fcb2825b072ef80002cfd9635e5c783fa946f0
diff --git a/README.md b/README.md
index db7b56f..735e368 100644
--- a/README.md
+++ b/README.md
@@ -27,6 +27,13 @@
 
 [Beam Docs](https://beam.apache.org/documentation/)
 
+# Unit Testing
+
+From the root of the infra repository, run the following command:
+```
+./test.py test packages/dataflow
+```
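+
+For reference, a Beam pipeline unit test of the kind this command runs might
+look like the following (a sketch with illustrative names, not the package's
+actual tests):
+```
+import apache_beam as beam
+from apache_beam.testing.test_pipeline import TestPipeline
+from apache_beam.testing.util import assert_that, equal_to
+
+def test_doubles_elements():
+  # TestPipeline executes the pipeline when the with-block exits.
+  with TestPipeline() as p:
+    doubled = p | beam.Create([1, 2, 3]) | beam.Map(lambda x: x * 2)
+    assert_that(doubled, equal_to([2, 4, 6]))
+```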
+
 # Workflow Testing
 
 There are a couple of requirements for testing your Dataflow workflow.
@@ -56,8 +63,8 @@
 [editor](https://pantheon.corp.google.com/iam-admin/iam) of that project to
 request access.
 
-Finally, run the command below to test your workflow. Note: Job names should
-match the regular expression [a-z]\([-a-z0-9]{0,38}[a-z0-9]).
+Finally, run the command below to test your workflow as a remote job. Note: Job
+names should match the regular expression [a-z]\([-a-z0-9]{0,38}[a-z0-9]); for
+example, `my-test-job-0` is a valid name.
 
 ```
 python <path-to-dataflow-job> --job_name <pick-a-job-name> \
@@ -72,8 +79,26 @@
 Running the test will leave behind a directory,
 `packages/dataflow/dataflow.egg-info`, that you must manually clean up.
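+
+For example, from the root of the infra repository (a sketch of the cleanup):
+```
+rm -rf packages/dataflow/dataflow.egg-info
+```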
 
+To run the workflow locally, first set credentials using
+```
+export GOOGLE_APPLICATION_CREDENTIALS=<path_to_credentials>
+```
+
+Then run:
+```
+python cq_attempts.py --output <dummy_path> --project <name_of_test_project>
+```
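+
+To confirm the credentials resolve before launching the job, a quick check such
+as the following works (a sketch; `google.auth` ships with the Google Cloud
+client libraries):
+```
+import google.auth
+
+# Loads Application Default Credentials from GOOGLE_APPLICATION_CREDENTIALS.
+credentials, project = google.auth.default()
+print('Using project:', project)
+```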
+
 # Updating the package
 
+Changes to this directory are automatically mirrored into a synthesized
+[repo](https://chromium.googlesource.com/infra/infra/packages/dataflow/). To
+deploy changes to this repository:
+* Land the changes.
+* Submit a separate CL that updates the version in `setup.py`.
+* Build and upload a new wheel (see the sketch after this list).
+* Submit a single CL that updates the remote execution recipe and deps.pyl.
+
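+A wheel build for the third step might look like the following (a sketch; the
+exact upload destination depends on the infra tooling):
+```
+cd packages/dataflow
+python setup.py bdist_wheel  # writes the wheel under dist/
+```
+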
 Jobs scheduled with the
 [remote_execute_dataflow_workflow](../../recipes/recipes/remote_execute_dataflow_workflow.py)
 recipe use the version of the job at HEAD but the version of the package pinned