
Reduce the impact of bin/storage dump

Summary:
Ref T12646.

  - Use "wb1" instead of "wb" to use level 1 gzip compression (faster, less compressy). Locally, this went about 2x faster and the output only grew 4% larger.
  - LinesOfALargeExecFuture does a lot of unnecessary string operations, and can boil down to a busy wait. The process is pretty saturated by I/O so this isn't the end of the world, but just use raw ExecFuture with FutureIterator so that we wait in `select()`.
  - Also, nice the process to +19 so we try to give other things CPU.
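
A condensed, standalone sketch of the three changes together (illustrative only, not the committed workflow code; it assumes libphutil's `ExecFuture`/`FutureIterator` classes are loaded, and the command and output path are placeholders):

```php
<?php

// Give other processes CPU priority. proc_nice() is unavailable in some
// PHP builds, so guard the call.
if (function_exists('proc_nice')) {
  proc_nice(19);
}

// "wb1" selects gzip level 1: much faster than the default level, at the
// cost of somewhat larger output.
$file = gzopen('/tmp/example-dump.sql.gz', 'wb1');

$future = new ExecFuture('%C', 'mysqldump --all-databases');

// Poll the raw future instead of line-splitting its output; the iterator
// blocks in select() between wakeups instead of busy waiting.
$iterator = id(new FutureIterator(array($future)))
  ->setUpdateInterval(0.100);

foreach ($iterator as $ready) {
  // Drain whatever output accumulated since the last wakeup.
  list($stdout, $stderr) = $future->read();
  $future->discardBuffers();

  if (strlen($stderr)) {
    fwrite(STDERR, $stderr);
  }

  if (strlen($stdout)) {
    if (gzwrite($file, $stdout) !== strlen($stdout)) {
      throw new Exception('Short write while compressing dump output.');
    }
  }

  // The iterator yields the future itself once the subprocess exits;
  // resolvex() throws if it exited with a nonzero status.
  if ($ready !== null) {
    $ready->resolvex();
  }
}

gzclose($file);
```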

Test Plan:
  - Ran `bin/storage dump --compress --output ...`.
  - Saw CPU time for dumping my local database drop from ~240s to ~90s, with output about 4% larger. Most of the win came from adding the `1`; the ExecFuture change helped a little, too.
  - I'm not sure of a great way to test `nice` locally, and its effect is system-dependent anyway, but nothing got worse or blew up.
  - Used `gzcat | head` and `gzcat | tail` on the result to sanity-check that everything was preserved (a rough PHP equivalent is sketched after this list).
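
A rough PHP equivalent of the `gzcat` sanity check (illustrative only; the path is a placeholder):

```php
<?php

// Decompress the whole dump once and print the first and last few lines,
// roughly what `gzcat dump.sql.gz | head` and `... | tail` show.
$path = '/tmp/example-dump.sql.gz';
$keep = 5;

$fp = gzopen($path, 'rb');
if (!$fp) {
  fwrite(STDERR, "Unable to open {$path}.\n");
  exit(1);
}

$head = array();
$tail = array();
// Read line by line (capped at 1MB per read for this sketch).
while (($line = gzgets($fp, 1024 * 1024)) !== false) {
  if (count($head) < $keep) {
    $head[] = $line;
  }
  $tail[] = $line;
  if (count($tail) > $keep) {
    array_shift($tail);
  }
}
gzclose($fp);

// Expect mysqldump's header comments at the top and its trailing
// "-- Dump completed" comment at the bottom.
echo implode('', $head), "[...]\n", implode('', $tail);
```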

Reviewers: chad, amckinley

Reviewed By: chad

Maniphest Tasks: T12646

Differential Revision: https://secure.phabricator.com/D17795

Author: epriestley
Date: 2017-04-26 11:48:44 -07:00
Parent: 6da73fb361
Commit: 85ff1d5c2d

@@ -138,6 +138,13 @@ final class PhabricatorStorageManagementDumpWorkflow
       $command = csprintf('mysqldump %Ls', $argv);
     }
+
+    // Decrease the CPU priority of this process so it doesn't contend with
+    // other more important things.
+    if (function_exists('proc_nice')) {
+      proc_nice(19);
+    }
+
     // If we aren't writing to a file, just passthru the command.
     if ($output_file === null) {
       return phutil_passthru('%C', $command);
@@ -148,7 +155,7 @@
     // a full disk). See T6996 for discussion.

     if ($is_compress) {
-      $file = gzopen($output_file, 'wb');
+      $file = gzopen($output_file, 'wb1');
     } else {
       $file = fopen($output_file, 'wb');
     }
@@ -162,23 +169,35 @@
     $future = new ExecFuture('%C', $command);
-    $lines = new LinesOfALargeExecFuture($future);

     try {
-      foreach ($lines as $line) {
-        $line = $line."\n";
-        if ($is_compress) {
-          $ok = gzwrite($file, $line);
-        } else {
-          $ok = fwrite($file, $line);
+      $iterator = id(new FutureIterator(array($future)))
+        ->setUpdateInterval(0.100);
+      foreach ($iterator as $ready) {
+        list($stdout, $stderr) = $future->read();
+        $future->discardBuffers();
+
+        if (strlen($stderr)) {
+          fwrite(STDERR, $stderr);
         }

-        if ($ok !== strlen($line)) {
-          throw new Exception(
-            pht(
-              'Failed to write %d byte(s) to file "%s".',
-              new PhutilNumber(strlen($line)),
-              $output_file));
+        if (strlen($stdout)) {
+          if ($is_compress) {
+            $ok = gzwrite($file, $stdout);
+          } else {
+            $ok = fwrite($file, $stdout);
+          }
+
+          if ($ok !== strlen($stdout)) {
+            throw new Exception(
+              pht(
+                'Failed to write %d byte(s) to file "%s".',
+                new PhutilNumber(strlen($stdout)),
+                $output_file));
+          }
+        }
+
+        if ($ready !== null) {
+          $ready->resolvex();
         }
       }