| commit | 20d8de8d512a5ba47997cb82fa7b028bc2221ae1 |
|---|---|
| author | Alisson Gusatti Azzolini <azzolini@fb.com>, Tue May 09 12:57:20 2017 -0700 |
| committer | Facebook GitHub Bot <facebook-github-bot@users.noreply.github.com>, Tue May 09 13:02:24 2017 -0700 |
| tree | 79b3159f396c1b9ca6dde6b228d7a6fd3b5b75fc |
| parent | bd8ed6641cae903068ef458cc652c22cbebb81cf |
Parameter cost estimation job

Summary: Adds a parameter cost estimation step before the actual training starts. The costs are later used to better shard the parameters across instances of the parameter server. Things I needed to modify:

- A few changes to make ModelLayerHelper picklable.
- Add support for stopping a distributed job after a number of stats-reporting steps.
- Refactored run_dist_job to support collocating the reader with the trainer even when parameter servers are present.
- Option to disable dense updates (when num_dense_servers=0).

Currently there's a large overhead from having to launch a child workflow; I'll try to address this in a subsequent diff. This is WIP because the other workflows need to be migrated as well. I can break this down into smaller diffs if reviewers would prefer it.

Reviewed By: kennyhorror

Differential Revision: D4974752

fbshipit-source-id: 04c336acb2945f8f11324a221ffc6967818c0672
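The summary says the estimated costs are used to better shard parameters across parameter-server instances. The commit does not show the sharding logic itself, but a minimal sketch of one common approach — a greedy longest-processing-time heuristic that always places the next-most-expensive parameter on the least-loaded server — could look like this. The function name, parameter names, and costs below are illustrative assumptions, not Caffe2's actual API.

```python
import heapq

def shard_parameters(param_costs, num_servers):
    """Hypothetical cost-based sharding: assign each parameter to the
    currently least-loaded parameter server (LPT greedy heuristic).

    param_costs: dict mapping parameter name -> estimated cost.
    Returns a dict mapping parameter name -> server index.
    """
    # Min-heap of (accumulated_cost, server_index), so the cheapest
    # server is always popped first.
    heap = [(0.0, s) for s in range(num_servers)]
    heapq.heapify(heap)
    assignment = {}
    # Placing the most expensive parameters first gives better balance.
    for name, cost in sorted(param_costs.items(), key=lambda kv: -kv[1]):
        load, server = heapq.heappop(heap)
        assignment[name] = server
        heapq.heappush(heap, (load + cost, server))
    return assignment

if __name__ == "__main__":
    # Made-up example costs for four parameters, sharded over two servers.
    costs = {"embedding": 8.0, "fc1": 3.0, "fc2": 2.0, "bias": 1.0}
    print(shard_parameters(costs, 2))
```

With uniform sharding the heavy `embedding` parameter would share a server with other large blobs; cost-aware placement isolates it, which is the motivation the summary gives for estimating costs before training starts.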
Caffe2 is a lightweight, modular, and scalable deep learning framework. Building on the original Caffe, Caffe2 is designed with expression, speed, and modularity in mind.
Please use GitHub issues (https://github.com/caffe2/caffe2/issues) to ask questions, report bugs, and request new features.
Please participate in our survey (https://www.surveymonkey.com/r/caffe2). We will send you information about new releases and special developer events/webinars.
Caffe2 is released under the BSD 2-Clause license.
Detailed build matrix (hit refresh if the status icons fail to load due to Heroku):

Target | Status
---|---
Linux | (build status badge)
Mac (CPU) | (build status badge)
Android | (build status badge)
iOS | (build status badge)
Linux + MKL | (build status badge)
Windows | (build status badge)