Merge pull request #449 from crawlab-team/develop

Develop
This commit is contained in:
Marvin Zhang
2020-01-13 12:35:45 +08:00
committed by GitHub
5 changed files with 30 additions and 9 deletions

View File

@@ -35,7 +35,7 @@ RUN apt-get update \
&& ln -s /usr/bin/python3 /usr/local/bin/python
# install backend
-RUN pip install scrapy pymongo bs4 requests
+RUN pip install scrapy pymongo bs4 requests crawlab-sdk
# add files
ADD . /app

View File

@@ -33,7 +33,7 @@ RUN chmod 777 /tmp \
&& ln -s /usr/bin/python3 /usr/local/bin/python
# install backend
-RUN pip install scrapy pymongo bs4 requests -i https://pypi.tuna.tsinghua.edu.cn/simple
+RUN pip install scrapy pymongo bs4 requests crawlab-sdk -i https://pypi.tuna.tsinghua.edu.cn/simple
# add files
ADD . /app

View File

@@ -41,8 +41,9 @@
### Pre-requisite (Docker)
- Docker 18.03+
-- Redis
+- Redis 5.x+
- MongoDB 3.6+
+- Docker Compose 1.24+ (optional but recommended)
### Pre-requisite (Direct Deploy)
- Go 1.12+
@@ -52,12 +53,16 @@
## Quick Start
Please open the command line prompt and execute the commands below. Make sure you have installed `docker-compose` in advance.
```bash
git clone https://github.com/crawlab-team/crawlab
cd crawlab
docker-compose up -d
```
Next, you can look into `docker-compose.yml` (with detailed config params) and refer to the [Documentation](http://docs.crawlab.cn) for further information.
## Run
### Docker
@@ -71,13 +76,11 @@ services:
    image: tikazyq/crawlab:latest
    container_name: master
    environment:
-      CRAWLAB_API_ADDRESS: "http://localhost:8000"
      CRAWLAB_SERVER_MASTER: "Y"
      CRAWLAB_MONGO_HOST: "mongo"
      CRAWLAB_REDIS_ADDRESS: "redis"
    ports:
-      - "8080:8080" # frontend
-      - "8000:8000" # backend
+      - "8080:8080"
    depends_on:
      - mongo
      - redis

View File

@@ -41,8 +41,9 @@ Two methods:
### Pre-requisite (Docker)
- Docker 18.03+
-- Redis
+- Redis 5.x+
- MongoDB 3.6+
+- Docker Compose 1.24+ (optional but recommended)
### Pre-requisite (Direct Deploy)
- Go 1.12+
@@ -52,12 +53,16 @@ Two methods:
## Quick Start
Please open the command line prompt and execute the commands below. Make sure you have installed `docker-compose` in advance.
```bash
git clone https://github.com/crawlab-team/crawlab
cd crawlab
docker-compose up -d
```
Next, you can look into the `docker-compose.yml` (with detailed config params) and the [Documentation (Chinese)](http://docs.crawlab.cn) for further information.
## Run
### Docker
@@ -76,8 +81,7 @@ services:
      CRAWLAB_MONGO_HOST: "mongo"
      CRAWLAB_REDIS_ADDRESS: "redis"
    ports:
-      - "8080:8080" # frontend
-      - "8000:8000" # backend
+      - "8080:8080"
    depends_on:
      - mongo
      - redis
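
Since the `master` service depends on `mongo` and `redis`, a minimal complete `docker-compose.yml` might look like the sketch below. The image tags and `restart` policies are illustrative assumptions, not taken from this PR; only the `master` block and the version pre-requisites come from the diff above.

```yaml
version: '3.3'
services:
  master:
    image: tikazyq/crawlab:latest
    container_name: master
    environment:
      CRAWLAB_SERVER_MASTER: "Y"
      CRAWLAB_MONGO_HOST: "mongo"
      CRAWLAB_REDIS_ADDRESS: "redis"
    ports:
      - "8080:8080"
    depends_on:
      - mongo
      - redis
  mongo:
    image: mongo:3.6   # MongoDB 3.6+ per the pre-requisites
    restart: always
  redis:
    image: redis:5     # Redis 5.x+ per the pre-requisites
    restart: always
```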

View File

@@ -119,6 +119,20 @@ func SetEnv(cmd *exec.Cmd, envs []model.Env, taskId string, dataCol string) *exe
	// default environment variables
	cmd.Env = append(os.Environ(), "CRAWLAB_TASK_ID="+taskId)
	cmd.Env = append(cmd.Env, "CRAWLAB_COLLECTION="+dataCol)
+	cmd.Env = append(cmd.Env, "CRAWLAB_MONGO_HOST="+viper.GetString("mongo.host"))
+	cmd.Env = append(cmd.Env, "CRAWLAB_MONGO_PORT="+viper.GetString("mongo.port"))
+	if viper.GetString("mongo.db") != "" {
+		cmd.Env = append(cmd.Env, "CRAWLAB_MONGO_DB="+viper.GetString("mongo.db"))
+	}
+	if viper.GetString("mongo.username") != "" {
+		cmd.Env = append(cmd.Env, "CRAWLAB_MONGO_USERNAME="+viper.GetString("mongo.username"))
+	}
+	if viper.GetString("mongo.password") != "" {
+		cmd.Env = append(cmd.Env, "CRAWLAB_MONGO_PASSWORD="+viper.GetString("mongo.password"))
+	}
+	if viper.GetString("mongo.authSource") != "" {
+		cmd.Env = append(cmd.Env, "CRAWLAB_MONGO_AUTHSOURCE="+viper.GetString("mongo.authSource"))
+	}
	cmd.Env = append(cmd.Env, "PYTHONUNBUFFERED=0")
	cmd.Env = append(cmd.Env, "PYTHONIOENCODING=utf-8")
	cmd.Env = append(cmd.Env, "TZ=Asia/Shanghai")
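
On the spider side, a task process can read the `CRAWLAB_MONGO_*` variables that `SetEnv` injects and assemble a MongoDB connection URI (pymongo is already installed in the image). The helper name and its defaults below are an illustrative sketch, not part of crawlab-sdk:

```python
import os

def mongo_uri_from_env(env=os.environ):
    # Hypothetical helper: build a MongoDB URI from the CRAWLAB_MONGO_*
    # variables that the Go SetEnv function above puts into the task's
    # environment. Username/password/authSource are only set conditionally,
    # so they are treated as optional here.
    host = env.get("CRAWLAB_MONGO_HOST", "localhost")
    port = env.get("CRAWLAB_MONGO_PORT", "27017")
    user = env.get("CRAWLAB_MONGO_USERNAME")
    password = env.get("CRAWLAB_MONGO_PASSWORD")
    auth_source = env.get("CRAWLAB_MONGO_AUTHSOURCE")
    cred = f"{user}:{password}@" if user and password else ""
    query = f"?authSource={auth_source}" if auth_source else ""
    return f"mongodb://{cred}{host}:{port}/{query}"

# With only the unconditional variables set:
print(mongo_uri_from_env({"CRAWLAB_MONGO_HOST": "mongo",
                          "CRAWLAB_MONGO_PORT": "27017"}))
# → mongodb://mongo:27017/
```

The resulting URI can be passed straight to `pymongo.MongoClient`.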