[dirmd] Deduplicate directory metadata

Programmatically delete redundant metadata.
This CL is entirely machine-generated:
1. Export the metadata in the "original" and "reduced" forms:
   dirmd read -form original > ~/tmp/dirmd/original.json
   dirmd read -form reduced  > ~/tmp/dirmd/reduced.json
2. Run reduce.py
   Source code: https://gist.github.com/nodirg/a4803af94ffe258ba0a6e0a4807141d8

The script diffs the two JSON files and strips redundant
lines from DIR_METADATA files. If a file becomes empty,
the script removes it.
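The reduce.py linked above is the authoritative implementation. As a rough sketch of the idea, assuming the `dirmd read` exports are JSON objects mapping directory paths to metadata dicts (the shapes and names below are illustrative, not the real dirmd schema):

```python
def dedupe(dirs_original: dict, dirs_reduced: dict) -> dict:
    """Strip redundant per-directory metadata.

    A key present in the "original" form but absent from the "reduced"
    form is inherited from an ancestor directory, so it can be deleted.
    Directories whose metadata becomes empty are dropped entirely,
    mirroring the removal of a now-empty DIR_METADATA file.
    """
    result = {}
    for path, meta in dirs_original.items():
        reduced = dirs_reduced.get(path, {})
        kept = {k: v for k, v in meta.items() if k in reduced}
        if kept:  # an empty dict means the whole file was redundant
            result[path] = kept
    return result
```
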

The results of the script were verified by diffing the canonical
representation of `dirmd read -form full` before and after
running the script: there is no diff.
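The before/after check can be sketched as a canonical-JSON equality test. This is a sketch under the assumption that the full-form exports are plain JSON objects; `canonical` and `full_forms_match` are hypothetical helper names, not part of dirmd:

```python
import json

def canonical(obj) -> str:
    # Serialize with sorted keys and fixed separators so that two
    # exports differing only in key order or whitespace compare equal.
    return json.dumps(obj, sort_keys=True, separators=(",", ":"))

def full_forms_match(before, after) -> bool:
    # "There is no diff" corresponds to this returning True.
    return canonical(before) == canonical(after)
```
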

Bug: 1179786
Change-Id: Ie5de3ba0962d266244ffb9b07c874b3dbbb69b9c
Reviewed-on: https://chromium-review.googlesource.com/c/chromium/src/+/2795985
Reviewed-by: John Abd-El-Malek <jam@chromium.org>
Reviewed-by: Fred Mello <fredmello@chromium.org>
Owners-Override: John Abd-El-Malek <jam@chromium.org>
Commit-Queue: Nodir Turakulov <nodir@chromium.org>
Cr-Commit-Position: refs/heads/master@{#883868}
GitOrigin-RevId: 6dfb6d1accb0d5b62ac31eac2ae080f10d64ab57