refactor: restructure frontend code, split components, and improve the README

dmy
2026-01-12 14:37:18 +08:00
parent 8b2f328981
commit 4f37b0fb61
9 changed files with 660 additions and 477 deletions

README.md

@@ -97,26 +97,86 @@ Nest is an MIT-licensed open source project. It can grow thanks to the sponsors
Nest is [MIT licensed](https://github.com/nestjs/nest/blob/master/LICENSE).
## How to Run
### 1. Database Setup
Update the `.env` file with your PostgreSQL credentials:
```env
DATABASE_TYPE=postgres
DATABASE_HOST=localhost
DATABASE_PORT=5432
DATABASE_USERNAME=your_username
DATABASE_PASSWORD=your_password
DATABASE_NAME=bidding
DATABASE_SYNCHRONIZE=true
```
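The values above are plain strings in `.env`, so the application has to coerce the port and the `synchronize` flag itself. A minimal sketch of how these keys might map onto a TypeORM-style options object (an assumption — the repo's actual module wiring is not shown; `buildDbOptions` is a hypothetical helper):

```typescript
// Sketch (assumption): map the .env keys above into a TypeORM-style options
// object. Key names match the .env file; the real NestJS wiring may differ.
function buildDbOptions(env: Record<string, string | undefined>) {
  return {
    type: env.DATABASE_TYPE ?? 'postgres',
    host: env.DATABASE_HOST ?? 'localhost',
    port: parseInt(env.DATABASE_PORT ?? '5432', 10), // .env values are strings
    username: env.DATABASE_USERNAME,
    password: env.DATABASE_PASSWORD,
    database: env.DATABASE_NAME ?? 'bidding',
    // synchronize must be a boolean, not the string "true"
    synchronize: env.DATABASE_SYNCHRONIZE === 'true',
  };
}
```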
### 2. Install Dependencies
```bash
npm install
cd frontend && npm install
```
### 3. Build and Start
```bash
# From the root directory
cd frontend && npm run build
cd ..
npm run build
npm run start
```
## Features
### Frontend Features
- **Dashboard**: View high priority bids and today's bids
- **Date Filtering**:
  - Click the "3天" (3 days) or "7天" (7 days) button to show bids from the last 3 or 7 days
  - The filter only constrains the start date: every bid from the selected start date onward is shown, with no upper bound on the date
- **Keyword Filtering**: Filter bids by keywords (saved in localStorage)
- **All Bids**: View all bids with pagination and source filtering
- **Keyword Management**: Add and delete keywords with weight-based priority
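The keyword filter described above survives page reloads via localStorage. A sketch of what that persistence and matching could look like (an assumption — `KeyValueStore`, the storage key, and the function names are hypothetical; `storage` stands in for `window.localStorage` so the logic is testable outside a browser):

```typescript
// Sketch (assumption): persist the keyword filter and apply it to bid titles.
// `KeyValueStore` mirrors the subset of the localStorage API used here.
interface KeyValueStore {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
}

const FILTER_KEY = 'bid-keyword-filter'; // hypothetical storage key

function saveKeywordFilter(storage: KeyValueStore, keywords: string[]): void {
  storage.setItem(FILTER_KEY, JSON.stringify(keywords));
}

function loadKeywordFilter(storage: KeyValueStore): string[] {
  const raw = storage.getItem(FILTER_KEY);
  return raw ? (JSON.parse(raw) as string[]) : [];
}

function matchesFilter(title: string, keywords: string[]): boolean {
  // An empty filter shows everything; otherwise any one keyword may match.
  return keywords.length === 0 || keywords.some((k) => title.includes(k));
}
```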
### Backend Features
- **Multi-Source Crawling**: Crawls bidding information from multiple sources:
- ChdtpCrawler
- ChngCrawler
- SzecpCrawler
- CdtCrawler
- EpsCrawler
- CnncecpCrawler
- CgnpcCrawler
- CeicCrawler
- EspicCrawler
- PowerbeijingCrawler
- **Automatic Retry**: If a crawler returns 0 items, it will be retried after all crawlers complete
- **Proxy Support**: Configurable proxy settings via environment variables
- **Scheduled Tasks**: Automatic crawling at scheduled intervals
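The retry behavior above — run every crawler once, then re-run only those that returned 0 items — can be sketched as follows (an assumption about the implementation; the `Crawler` interface and `runAllWithRetry` are illustrative, not the repo's actual API):

```typescript
// Sketch (assumption): run all crawlers, then retry any that returned 0 items
// only after the full first pass has completed.
interface Crawler {
  name: string;
  crawl(): Promise<number>; // number of bidding items fetched
}

async function runAllWithRetry(crawlers: Crawler[]): Promise<Map<string, number>> {
  const results = new Map<string, number>();
  const emptyOnFirstPass: Crawler[] = [];

  // First pass: run every crawler, remembering the ones with no results.
  for (const c of crawlers) {
    const count = await c.crawl();
    results.set(c.name, count);
    if (count === 0) emptyOnFirstPass.push(c);
  }

  // Second pass: retry the empty ones after all crawlers have completed.
  for (const c of emptyOnFirstPass) {
    results.set(c.name, await c.crawl());
  }
  return results;
}
```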
### Environment Variables
```env
# Database
DATABASE_TYPE=postgres
DATABASE_HOST=localhost
DATABASE_PORT=5432
DATABASE_USERNAME=your_username
DATABASE_PASSWORD=your_password
DATABASE_NAME=bidding
DATABASE_SYNCHRONIZE=true

# Proxy (optional)
PROXY_HOST=your_proxy_host
PROXY_PORT=your_proxy_port
PROXY_USERNAME=your_proxy_username
PROXY_PASSWORD=your_proxy_password
```
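Since the proxy is optional, the crawlers presumably only build a proxy configuration when both host and port are present. A sketch of assembling these variables into a proxy URL that an HTTP client could consume (an assumption — `buildProxyUrl` is a hypothetical helper, not the repo's actual code):

```typescript
// Sketch (assumption): turn the optional PROXY_* variables above into a
// proxy URL, or null when no proxy is configured.
function buildProxyUrl(env: Record<string, string | undefined>): string | null {
  const { PROXY_HOST, PROXY_PORT, PROXY_USERNAME, PROXY_PASSWORD } = env;
  if (!PROXY_HOST || !PROXY_PORT) return null; // proxy is optional

  // Include credentials only when both are set; escape special characters.
  const auth =
    PROXY_USERNAME && PROXY_PASSWORD
      ? `${encodeURIComponent(PROXY_USERNAME)}:${encodeURIComponent(PROXY_PASSWORD)}@`
      : '';
  return `http://${auth}${PROXY_HOST}:${PROXY_PORT}`;
}
```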
## Initial Setup
The system will automatically initialize with the preset keywords: "山东", "海", "建设", "工程", "采购". You can manage these and view crawled bidding information at http://localhost:3000.
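Automatic initialization like this is typically idempotent, so restarting the app does not duplicate the preset keywords. A sketch of that seeding step (an assumption — the `repo` shape and `seedKeywords` are hypothetical; only the preset keyword list comes from the text above):

```typescript
// Sketch (assumption): seed the preset keywords on startup, skipping any
// that already exist so restarts do not create duplicates.
const PRESET_KEYWORDS = ['山东', '海', '建设', '工程', '采购'];

interface KeywordRepo {
  findAll(): Promise<string[]>;
  add(word: string): Promise<void>;
}

async function seedKeywords(repo: KeywordRepo): Promise<number> {
  const existing = new Set(await repo.findAll());
  let added = 0;
  for (const word of PRESET_KEYWORDS) {
    if (!existing.has(word)) {
      await repo.add(word);
      added++;
    }
  }
  return added; // number of keywords newly inserted
}
```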