Sheriffing/TBPL/DeveloperDocs



(I'm going to start a design doc right here, though it should probably be broken out to a different wiki page at some point.)
Note: TBPL is in a transitional period. We're trying to get rid of the dependency on Tinderbox and replace the missing pieces with data from Buildbot and reimplemented functionality on the TBPL server. That's why TBPL can currently operate in two modes: "Tinderbox mode" and "Buildbot mode". Tinderbox mode is the default at the moment, and adding "&usebuildbot=1" to the URL will activate Buildbot mode.


TBPL is mostly about useful presentation of data that's already available elsewhere. For example, the list of pushes to a repository is at hg.mozilla.org/repository/pushloghtml ([http://hg.mozilla.org/mozilla-central/pushloghtml e.g. mozilla-central pushlog]). Another example is the [https://build.mozilla.org/buildapi/self-serve self-serve UI] that allows one to cancel and restart builds. So it's TBPL's job to reassemble that data in a way that's as accessible for developers as possible.
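As a rough illustration of consuming one of those outside sources, here is a minimal sketch that pulls recent pushes from the pushlog's JSON endpoint (json-pushes). The endpoint behaviour and the "user"/"changesets" field names are assumptions for this sketch, not taken from TBPL's actual import code.

<pre>
# Hypothetical sketch: read recent pushes from the hg.mozilla.org pushlog.
# The json-pushes endpoint behaviour and the "user"/"changesets" field
# names are assumptions; TBPL's real import code may differ.
import json
import urllib.request

PUSHLOG_URL = "https://hg.mozilla.org/mozilla-central/json-pushes"

with urllib.request.urlopen(PUSHLOG_URL) as response:
    pushes = json.load(response)  # assumed: dict keyed by push ID

for push_id, push in sorted(pushes.items(), key=lambda kv: int(kv[0])):
    print(f"push {push_id} by {push.get('user', '?')}: "
          f"{len(push.get('changesets', []))} changeset(s)")
</pre>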


For some of that source data, TBPL is the only human-readable presentation. Run result data, for example, is only available as JSON (from Buildbot).


There are two exceptions to the "data comes from outside" principle: job comments (also called "build stars") and the list of hidden builders are stored on the TBPL server itself, in a MongoDB database. In Tinderbox mode both of these were stored on Tinderbox.
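To make that storage concrete, here is a minimal sketch of what writing a build star and hiding a builder could look like against the TBPL MongoDB. The collection names (runs_notes, hidden_builders) and document fields are illustrative assumptions, not TBPL's actual schema.

<pre>
# Hypothetical sketch only: collection and field names are assumptions.
from datetime import datetime, timezone
from pymongo import MongoClient

db = MongoClient("localhost", 27017)["tbpl"]

# Attach a comment ("build star") to a specific run.
db.runs_notes.insert_one({
    "run_id": 12345678,                       # assumed run identifier
    "who": "developer@example.com",
    "note": "Bug 123456 - intermittent timeout in test_foo.html",
    "timestamp": datetime.now(timezone.utc),
})

# Hide a builder so it is filtered out of the default view.
db.hidden_builders.update_one(
    {"name": "Rev3 Fedora 12 mozilla-central opt test mochitests-1/5"},
    {"$set": {"hidden": True}},
    upsert=True,
)
</pre>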


=== Architecture ===


(to be continued...)
= Logical diagram  =
Can I see an example of such a diagram? I'm unclear what's requested here.
= Physical diagram  =
not fixed yet


= Hardware  =


Any form of Unix that's supported by Apache, PHP, Python and MongoDB should work. IT will decide what will actually be used.
= Interface settings and IP allocations  =
This section needs to be filled in by IT.
=== VLANs  ===
=== Private interfaces  ===
=== Public interfaces  ===


= Network flows  =
Stage: tbpl_allizom_org
Dev: tbpl_dev_allizom_org
Rewriting TBPL to use Elasticsearch instead would only take a few hours, but it's unclear whether it's worth doing.
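A small sketch of how the application could select the right database per environment follows. Only the stage and dev database names come from this page; the environment variable, host and port are assumptions.

<pre>
# Hypothetical sketch: select the MongoDB database by environment.
# Only the stage/dev database names are from this page; everything else
# (environment variable, host, port) is assumed.
import os
from pymongo import MongoClient

DB_NAMES = {
    "stage": "tbpl_allizom_org",
    "dev": "tbpl_dev_allizom_org",
}

env = os.environ.get("TBPL_ENV", "dev")
db = MongoClient("localhost", 27017)[DB_NAMES[env]]
</pre>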


= File storage  =
There needs to be a cron job that periodically runs tbpl/dataimport/import-buildbot-data.py in order to import Buildbot data into the MongoDB database. The import frequency hasn't been fixed yet, but it will probably be between one and five minutes. (The Buildbot source data is regenerated every minute.)
The importer is idempotent; it never destroys data and it doesn't insert duplicates.
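The idempotency can be pictured as an upsert keyed on a unique run identifier: re-running the import refreshes existing documents instead of duplicating them. The sketch below only illustrates that pattern; it is not the actual import-buildbot-data.py code, and the collection and field names are assumptions.

<pre>
# Hypothetical sketch of the idempotent-import pattern (not the real
# import-buildbot-data.py): upsert each run keyed by an assumed unique
# Buildbot run id, so re-running never duplicates or destroys data.
from pymongo import MongoClient

db = MongoClient("localhost", 27017)["tbpl"]

def import_runs(runs):
    """Insert or refresh run documents; safe to re-run at any frequency."""
    for run in runs:
        db.runs.update_one(
            {"_id": run["id"]},   # assumed unique run identifier
            {"$set": run},
            upsert=True,
        )

# Assumed cron cadence, e.g. every 5 minutes:
#   */5 * * * * python /path/to/tbpl/dataimport/import-buildbot-data.py
</pre>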
=== Puppet  ===
(what modules/classes will be used?)
(what's "Puppet"?)
= Monitoring  =
=== Nagios  ===
(what checks will need to be set up for this service?)
= Ganglia Update / Push procedure  =
(how to update the code, db, etc)
= Common Troubleshooting  =
= Backup / DR  =
(Where are backups stored, if any? How can someone else fix this site in a disaster?)
THIS IS ALL OUT OF DATE
I don't know. Only the MongoDB would need to be backed up, but how would that be done?
Most of TBPL's data comes from other sources and can be reassembled by simply running the importer (dataimport/import-buildbot-data.py) again. The two exceptions are job comments (also called "build stars") and the hidden builder list, which are directly stored in the TBPL MongoDB and don't come from other sources.
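One possible approach, sketched below, is to dump just those two collections to JSON on a schedule; the collection names are assumptions, and where the dumps would be stored is up to IT.

<pre>
# Hypothetical backup sketch: export only the data that cannot be
# reassembled by re-running the importer. Collection names are assumptions.
from datetime import datetime, timezone

from bson import json_util
from pymongo import MongoClient

db = MongoClient("localhost", 27017)["tbpl"]
stamp = datetime.now(timezone.utc).strftime("%Y%m%d")

for collection in ("runs_notes", "hidden_builders"):
    docs = list(db[collection].find())
    with open(f"tbpl-backup-{collection}-{stamp}.json", "w") as f:
        f.write(json_util.dumps(docs, indent=2))
</pre>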
= Staging site  =
= Deployment Bugs  =
= Admin Contacts  =
Primary Admin:
Secondary Admin:


= Developer Contacts =