dmy 3d269ce9d1 feat: refactor the AI recommendation feature and normalize crawler base URLs
Refactor the frontend AI recommendation component: remove the local filtering logic and fetch data for a date range from the backend instead
Add an AI service module containing the prompt and recommendation logic
Add a date-range query endpoint to the bidding service
Unify the baseURL format across the crawler services
2026-01-12 18:59:17 +08:00


A progressive Node.js framework for building efficient and scalable server-side applications.


Description

Nest framework TypeScript starter repository.

Project setup

$ npm install

Compile and run the project

# development
$ npm run start

# watch mode
$ npm run start:dev

# production mode
$ npm run start:prod

Run tests

# unit tests
$ npm run test

# e2e tests
$ npm run test:e2e

# test coverage
$ npm run test:cov

Deployment

When you're ready to deploy your NestJS application to production, there are some key steps you can take to ensure it runs as efficiently as possible. Check out the deployment documentation for more information.

If you are looking for a cloud-based platform to deploy your NestJS application, check out Mau, our official platform for deploying NestJS applications on AWS. Mau makes deployment straightforward and fast, requiring just a few simple steps:

$ npm install -g @nestjs/mau
$ mau deploy

With Mau, you can deploy your application in just a few clicks, allowing you to focus on building features rather than managing infrastructure.

Resources

Check out a few resources that may come in handy when working with NestJS:

  • Visit the NestJS Documentation to learn more about the framework.
  • For questions and support, please visit our Discord channel.
  • To dive deeper and get more hands-on experience, check out our official video courses.
  • Deploy your application to AWS with the help of NestJS Mau in just a few clicks.
  • Visualize your application graph and interact with the NestJS application in real-time using NestJS Devtools.
  • Need help with your project (part-time to full-time)? Check out our official enterprise support.
  • To stay in the loop and get updates, follow us on X and LinkedIn.
  • Looking for a job, or have a job to offer? Check out our official Jobs board.

Support

Nest is an MIT-licensed open source project. It can grow thanks to the sponsors and support by the amazing backers. If you'd like to join them, please read more here.

License

Nest is MIT licensed.

How to Run

1. Database Setup

Update the .env file with your PostgreSQL credentials:

DATABASE_TYPE=postgres
DATABASE_HOST=localhost
DATABASE_PORT=5432
DATABASE_USERNAME=your_username
DATABASE_PASSWORD=your_password
DATABASE_NAME=bidding
DATABASE_SYNCHRONIZE=true
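As a rough sketch of how these variables are typically consumed in a NestJS app, the helper below maps them into the options object a `TypeOrmModule.forRoot()` call would receive. The env key names match the example above; the `buildDbConfig` helper itself is hypothetical, not the project's actual wiring.

```typescript
interface DbConfig {
  type: string;
  host: string;
  port: number;
  username: string;
  password: string;
  database: string;
  synchronize: boolean;
}

// Hypothetical helper: turn the .env variables above into a TypeORM-style
// connection config, with the same defaults the example suggests.
function buildDbConfig(env: Record<string, string | undefined>): DbConfig {
  return {
    type: env.DATABASE_TYPE ?? "postgres",
    host: env.DATABASE_HOST ?? "localhost",
    port: parseInt(env.DATABASE_PORT ?? "5432", 10),
    username: env.DATABASE_USERNAME ?? "",
    password: env.DATABASE_PASSWORD ?? "",
    database: env.DATABASE_NAME ?? "bidding",
    // synchronize=true auto-creates/alters tables on startup;
    // convenient for development, risky in production
    synchronize: (env.DATABASE_SYNCHRONIZE ?? "false") === "true",
  };
}
```

Note that `DATABASE_SYNCHRONIZE=true` is a development convenience; for production deployments it is safer to disable it and use migrations.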

2. Install Dependencies

npm install
cd frontend && npm install

3. Build and Start

# From the root directory
cd frontend && npm run build
cd ..
npm run build
npm run start

Features

Frontend Features

  • Dashboard: View high-priority bids and today's bids
  • Date Filtering:
      • Click the "3天" (3-day) or "7天" (7-day) buttons to show bids from the last 3 or 7 days
      • The filter only limits the start date: all data from the selected start date onwards is shown, including data newer than the end date
  • Keyword Filtering: Filter bids by keywords (saved in localStorage)
  • All Bids: View all bids with pagination and source filtering
  • Keyword Management: Add and delete keywords with weight-based priority
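The start-date-only semantics of the 3-day/7-day filter can be sketched as a small pure function. This is an illustration of the behaviour described above, not the actual frontend code; the `Bid` shape and function name are assumptions.

```typescript
interface Bid {
  title: string;
  publishDate: string; // ISO date, e.g. "2026-01-10"
}

// Keep every bid published on or after (today - days).
// Deliberately no upper bound: bids newer than today also pass,
// matching the "only limits the start date" behaviour.
function filterByDays(bids: Bid[], days: number, today: Date): Bid[] {
  const start = new Date(today);
  start.setDate(start.getDate() - days);
  return bids.filter((b) => new Date(b.publishDate) >= start);
}
```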

Backend Features

  • Multi-Source Crawling: Crawls bidding information from multiple sources:
      • ChdtpCrawler
      • ChngCrawler
      • SzecpCrawler
      • CdtCrawler
      • EpsCrawler
      • CnncecpCrawler
      • CgnpcCrawler
      • CeicCrawler
      • EspicCrawler
      • PowerbeijingCrawler
  • Automatic Retry: If a crawler returns 0 items, it will be retried after all crawlers complete
  • Proxy Support: Configurable proxy settings via environment variables
  • Scheduled Tasks: Automatic crawling at scheduled intervals
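The retry pass described above (re-running any crawler that returned 0 items once all crawlers have finished) can be sketched as follows. The `Crawler` shape and function name are illustrative assumptions, not the project's actual interfaces.

```typescript
// A crawler here is just a named async job that reports how many items it found.
type Crawler = { name: string; run: () => Promise<number> };

async function crawlWithRetry(crawlers: Crawler[]): Promise<Map<string, number>> {
  const counts = new Map<string, number>();
  // First pass: run every crawler once.
  for (const c of crawlers) {
    counts.set(c.name, await c.run());
  }
  // Second pass: only after all crawlers complete, retry the ones
  // that found nothing (a zero result often indicates a transient failure).
  for (const c of crawlers) {
    if (counts.get(c.name) === 0) {
      counts.set(c.name, await c.run());
    }
  }
  return counts;
}
```

A single retry keeps the schedule bounded; a transient network error gets a second chance, while a genuinely empty source simply records 0 twice.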

Environment Variables

# Database
DATABASE_TYPE=postgres
DATABASE_HOST=localhost
DATABASE_PORT=5432
DATABASE_USERNAME=your_username
DATABASE_PASSWORD=your_password
DATABASE_NAME=bidding
DATABASE_SYNCHRONIZE=true

# Proxy (optional)
PROXY_HOST=your_proxy_host
PROXY_PORT=your_proxy_port
PROXY_USERNAME=your_proxy_username
PROXY_PASSWORD=your_proxy_password
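Since the proxy is optional, the crawlers presumably only attach it when `PROXY_HOST` and `PROXY_PORT` are set. A sketch of that logic, producing an axios-style `proxy` option, is shown below; the `buildProxyConfig` helper is hypothetical.

```typescript
interface ProxyConfig {
  host: string;
  port: number;
  auth?: { username: string; password: string };
}

// Return undefined when no proxy is configured, so callers can pass the
// result straight through as an optional request setting.
function buildProxyConfig(env: Record<string, string | undefined>): ProxyConfig | undefined {
  if (!env.PROXY_HOST || !env.PROXY_PORT) return undefined;
  const proxy: ProxyConfig = {
    host: env.PROXY_HOST,
    port: parseInt(env.PROXY_PORT, 10),
  };
  // Credentials are also optional; only attach auth when both are present.
  if (env.PROXY_USERNAME && env.PROXY_PASSWORD) {
    proxy.auth = { username: env.PROXY_USERNAME, password: env.PROXY_PASSWORD };
  }
  return proxy;
}
```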

Initial Setup

The system will automatically initialize with the preset keywords: "山东", "海", "建设", "工程", "采购". You can manage these and view crawled bidding information at http://localhost:3000.
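The weight-based priority mentioned under Keyword Management can be pictured as a simple score: a bid's priority is the sum of the weights of the keywords its title contains. This is an assumed illustration of the behaviour, not the actual scoring code; the weights below are made up, only the keyword strings come from the preset list above.

```typescript
// Hypothetical scoring: sum the weights of every configured keyword
// that appears in the bid title.
function scoreBid(title: string, keywords: Map<string, number>): number {
  let score = 0;
  for (const [keyword, weight] of keywords) {
    if (title.includes(keyword)) score += weight;
  }
  return score;
}
```

Under this model, a bid matching several high-weight keywords naturally floats to the top of the dashboard.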
