
Parse hosted files migration tools #2


Merged: 12 commits, Jul 19, 2016
2 changes: 2 additions & 0 deletions .gitignore
@@ -31,3 +31,5 @@ node_modules

# Optional REPL history
.node_repl_history

.DS_Store
@acinader (Member) commented on Jul 19, 2016:
my $.02 is that this should go in ~/.gitignore_global (ymmv)

and there should be a cr at the end of this file.

51 changes: 41 additions & 10 deletions README.md
@@ -1,20 +1,51 @@
# parse-files-utils
Utilities to list and migrate Parse files
Utilities to list and migrate Parse files.

This utility will print in the terminal all the files URL's from the parse server
This utility will do the following:

This can be really useful when you migrate your files and want to move the files from the Parse S3 host to you own.
1. Get all files across all classes in a Parse database.
2. Print file URLs to console OR transfer to S3, GCS, or filesystem.
3. Rename files so that [Parse Server](https://github.com/ParsePlatform/parse-server) no longer detects that they are hosted by Parse.
4. Update MongoDB with new file names.

This utility won't save the files anywhere else. You can save the results to a file or pipe the results to another program:
#### \*WARNING\*
As soon as this script transfers files away from Parse.com hosting (and renames them in the database),
any clients that use api.parse.com will no longer be able to access the files.
See the section titled "5. Files" in the [Parse Migration Guide](https://parse.com/migration)
and Parse Server [issue #1582](https://github.com/ParsePlatform/parse-server/issues/1582).

## usage
## Installation

1. Clone the repo: `git clone [email protected]:parse-server-modules/parse-files-utils.git`
2. cd into the repo: `cd parse-files-utils`
3. Install dependencies: `npm install`

## Usage

The quickest way to get started is to run `npm start` and follow the command prompts.

You can optionally specify a js/json configuration file (see [config.example.js](./config.example.js)).
```
$ node index.js MY_APP_ID MY_MASTER_KEY
$ npm start config.js
```

you can optionally specify a server URL
### Available configuration options

```
$ node index.js MY_APP_ID MY_MASTER_KEY MY_SERVER_URL
```
* `applicationId`: Parse application id.
* `masterKey`: Parse master key.
* `mongoURL`: MongoDB connection URL.
* `serverURL`: The URL for the Parse Server (default: https://api.parse.com/1).
* `filesToTransfer`: Which files to transfer. Accepted options: `parseOnly`, `parseServerOnly`, `all`.
* `renameInDatabase` (boolean): Whether or not to rename files in MongoDB.
* `filesAdapter`: A Parse Server file adapter that implements `createFile(filename, data)`
(e.g. [parse-server-fs-adapter](https://github.com/parse-server-modules/parse-server-fs-adapter),
[parse-server-s3-adapter](https://github.com/parse-server-modules/parse-server-s3-adapter),
[parse-server-gcs-adapter](https://github.com/parse-server-modules/parse-server-gcs-adapter)); see the sketch after this list.
* `filesystemPath`: The path/directory to save files to when transferring to the filesystem.
* `aws_accessKeyId`: AWS access key id.
* `aws_secretAccessKey`: AWS secret access key.
* `aws_bucket`: S3 bucket name.
* `gcs_projectId`: GCS project id.
* `gcs_keyFilename`: GCS key filename (e.g. `credentials.json`).
* `gcs_bucket`: GCS bucket name.
* `asyncLimit`: The number of files to process at the same time (default: 5).
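
For illustration, here is a hypothetical minimal `config.js` (not part of this PR) that transfers Parse-hosted files to the local filesystem with an inline adapter. The option names come from the list above; the assumption that `createFile(filename, data)` should return a Promise mirrors the bundled adapters.

```
// Hypothetical sketch, not the project's shipped example (see config.example.js).
var fs = require('fs');
var path = require('path');

// Make sure the destination directory exists before any files are written.
var dir = path.resolve('./downloaded_files');
if (!fs.existsSync(dir)) { fs.mkdirSync(dir); }

module.exports = {
  applicationId: "PARSE_APPLICATION_ID",
  masterKey: "PARSE_MASTER_KEY",
  mongoURL: "mongodb://localhost:27017/database_name",
  serverURL: "https://api.parse.com/1",
  filesToTransfer: 'parseOnly',
  renameInDatabase: false,

  // Presumably any object exposing createFile(filename, data) can be used here;
  // returning a Promise is an assumption based on the bundled adapters.
  filesAdapter: {
    createFile: function(filename, data) {
      return new Promise(function(resolve, reject) {
        fs.writeFile(path.join(dir, filename), data, function(err) {
          return err ? reject(err) : resolve();
        });
      });
    }
  }
};
```

In practice the bundled fs, S3, and GCS adapters (shown in `config.example.js` below) cover the common cases; an inline adapter is mainly useful for custom destinations.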
40 changes: 40 additions & 0 deletions config.example.js
@@ -0,0 +1,40 @@
var FileAdapter = require('parse-server-fs-adapter');
var S3Adapter = require('parse-server-s3-adapter');
var GCSAdapter = require('parse-server-gcs-adapter');

module.exports = {
applicationId: "PARSE_APPLICATION_ID",
masterKey: "PARSE_MASTER_KEY",
mongoURL: "mongodb://<username>:<password>@mongourl.com:27017/database_name",
serverURL: "https://api.customparseserver.com/parse",
filesToTransfer: 'parseOnly',
renameInDatabase: false,

// For filesystem configuration
filesystemPath: './downloaded_files',

// For S3 configuration
aws_accessKeyId: "ACCESS_KEY_ID",
aws_secretAccessKey: "SECRET_ACCESS_KEY",
aws_bucket: "BUCKET_NAME",

// For GCS configuration
gcs_projectId: "GCS_PROJECT_ID",
gcs_keyFilename: "credentials.json",
gcs_bucket: "BUCKET_NAME",

// Or set filesAdapter to a Parse Server file adapter
// filesAdapter: new FileAdapter({
// filesSubDirectory: './downloaded_files'
// }),
// filesAdapter: new S3Adapter({
// accessKey: 'ACCESS_KEY_ID',
// secretKey: 'SECRET_ACCESS_KEY',
// bucket: 'BUCKET_NAME'
// }),
// filesAdapter: new GCSAdapter({
// projectId: "GCS_PROJECT_ID",
// keyFilename: "credentials.json",
// bucket: "BUCKET_NAME",
// }),
};
Member commented:
cr at end of file.

20 changes: 13 additions & 7 deletions index.js
@@ -1,10 +1,16 @@
var appID = process.argv[2];
var masterKey = process.argv[3];
var serverURL = process.argv[4];
var path = require('path');
var configFilePath = process.argv[2];
var config = {};

if (!appID || !masterKey) {
process.stderr.write('An appId and a masterKey are required\n');
process.exit(1);
if (configFilePath) {
configFilePath = path.resolve(configFilePath);

try {
config = require(configFilePath);
} catch(e) {
console.log('Cannot load '+configFilePath);
process.exit(1);
}
}

var utils = require('./lib')(appID, masterKey, serverURL);
var utils = require('./lib')(config);
107 changes: 74 additions & 33 deletions lib/index.js
@@ -1,5 +1,56 @@
var Parse = require('parse/node');
var inquirer = require('inquirer');

var schemas = require('./schemas');
var transfer = require('./transfer');
var questions = require('./questions.js');

module.exports = initialize;

function initialize(config) {
questions(config).then(function (answers) {
config = Object.assign(config, answers);
console.log(JSON.stringify(config, null, 2));
return inquirer.prompt({
type: 'confirm',
name: 'next',
message: 'About to start the file transfer. Does the above look correct?',
default: true,
});
}).then(function(answers) {
if (!answers.next) {
console.log('Aborted!');
process.exit();
}
Parse.initialize(config.applicationId, null, config.masterKey);
Parse.serverURL = config.serverURL;
return transfer.init(config);
}).then(function() {
return getAllFileObjects();
}).then(function(objects) {
return transfer.run(objects);
}).then(function() {
console.log('Complete!');
process.exit();
}).catch(function(error) {
console.log(error);
process.exit(1);
});
}

function getAllFileObjects() {
console.log("Fetching schema...");
return schemas.get().then(function(res){
console.log("Fetching all objects with files...");
var schemasWithFiles = onlyFiles(res);
return Promise.all(schemasWithFiles.map(getObjectsWithFilesFromSchema));
}).then(function(results) {
var files = results.reduce(function(c, r) {
return c.concat(r);
}, []);
return Promise.resolve(files);
});
}

function onlyFiles(schemas) {
return schemas.map(function(schema) {
@@ -18,53 +69,43 @@ function onlyFiles(schemas) {

function getAllObjects(baseQuery) {
var allObjects = [];
var next = function(startIndex) {
baseQuery.skip(startIndex);
var next = function() {
if (allObjects.length) {
baseQuery.greaterThan('createdAt', allObjects[allObjects.length-1].createdAt);
Contributor commented:

we should order on createdAt and force the greaterThan no?

Contributor (Author) replied:

We do 😃 The query is created in getObjectsWithFilesFromSchema with query.ascending('createdAt') on line 93.

@flovilmart (Contributor) commented on Jul 19, 2016:

Awesome!

}
return baseQuery.find({useMasterKey: true}).then(function(r){
allObjects = allObjects.concat(r);
if (r.length == 0) {
return Promise.resolve(allObjects);
} else {
return next(startIndex+r.length);
return next();
}
});
}
return next(0);
return next();
}

function getFilesFromSchema(schema) {
function getObjectsWithFilesFromSchema(schema) {
var query = new Parse.Query(schema.className);
query.select(schema.fields);
query.select(schema.fields.concat('createdAt'));
query.ascending('createdAt');
query.limit(1000);
schema.fields.forEach(function(field) {
query.exists(field);
})
});
return getAllObjects(query).then(function(results) {
return results.reduce(function(current, result){
return current.concat(schema.fields.map(function(field){
return result.get(field).url();
}))
return current.concat(
schema.fields.map(function(field){
return {
className: schema.className,
objectId: result.id,
fieldName: field,
fileName: result.get(field).name(),
url: result.get(field).url()
}
})
);
}, []);
});
}

module.exports = function(applicationId, masterKey, serverURL) {
Parse.initialize(applicationId, null, masterKey);
Parse.serverURL = serverURL || "https://api.parse.com/1";
schemas.get().then(function(res){
var schemasWithFiles = onlyFiles(res);
return Promise.all(schemasWithFiles.map(getFilesFromSchema));
}).then(function(results) {
var files = results.reduce(function(c, r) {
return c.concat(r);
}, []);
files.forEach(function(file) {
process.stdout.write(file);
process.stdout.write("\n");
});
process.exit(0);
}).catch(function(err){
process.stderr.write(err);
process.stderr.write("\n");
process.exit(1);
})
}
}
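
For reference, a condensed sketch of the `createdAt`-based paging pattern discussed in the review thread above. This is an illustration rather than the exact code in the diff, and it assumes the standard Parse JS SDK query API (`ascending`, `greaterThan`, `limit`, `find`).

```
// Sketch: page through a class by ordering on createdAt and resuming after the
// last object fetched, instead of paging with skip()-based offsets.
var Parse = require('parse/node');

function fetchAllObjects(className) {
  var query = new Parse.Query(className);
  query.ascending('createdAt');
  query.limit(1000);

  var allObjects = [];
  function next() {
    if (allObjects.length) {
      // Resume just past the newest createdAt seen so far.
      query.greaterThan('createdAt', allObjects[allObjects.length - 1].createdAt);
    }
    return query.find({useMasterKey: true}).then(function(results) {
      allObjects = allObjects.concat(results);
      return results.length === 0 ? allObjects : next();
    });
  }
  return next();
}
```

Compared with skip()-based paging, this avoids ever-larger offsets on big classes.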