A progressive Node.js framework for building efficient and scalable server-side applications.
## Description

[Nest](https://github.com/nestjs/nest) framework TypeScript starter repository.

## Project setup

```bash
$ npm install
```

## Compile and run the project

```bash
# development
$ npm run start

# watch mode
$ npm run start:dev

# production mode
$ npm run start:prod
```

## Run tests

```bash
# unit tests
$ npm run test

# e2e tests
$ npm run test:e2e

# test coverage
$ npm run test:cov
```

## Deployment

When you're ready to deploy your NestJS application to production, there are some key steps you can take to ensure it runs as efficiently as possible. Check out the [deployment documentation](https://docs.nestjs.com/deployment) for more information.

If you are looking for a cloud-based platform to deploy your NestJS application, check out [Mau](https://mau.nestjs.com), our official platform for deploying NestJS applications on AWS. Mau makes deployment straightforward and fast, requiring just a few simple steps:

```bash
$ npm install -g @nestjs/mau
$ mau deploy
```

With Mau, you can deploy your application in just a few clicks, allowing you to focus on building features rather than managing infrastructure.

## Resources

Check out a few resources that may come in handy when working with NestJS:

- Visit the [NestJS Documentation](https://docs.nestjs.com) to learn more about the framework.
- For questions and support, please visit our [Discord channel](https://discord.gg/G7Qnnhy).
- To dive deeper and get more hands-on experience, check out our official video [courses](https://courses.nestjs.com/).
- Deploy your application to AWS with the help of [NestJS Mau](https://mau.nestjs.com) in just a few clicks.
- Visualize your application graph and interact with the NestJS application in real-time using [NestJS Devtools](https://devtools.nestjs.com).
- Need help with your project (part-time to full-time)? Check out our official [enterprise support](https://enterprise.nestjs.com).
- To stay in the loop and get updates, follow us on [X](https://x.com/nestframework) and [LinkedIn](https://linkedin.com/company/nestjs).
- Looking for a job, or have a job to offer? Check out our official [Jobs board](https://jobs.nestjs.com).

## Support

Nest is an MIT-licensed open source project. It can grow thanks to the sponsors and support by the amazing backers. If you'd like to join them, please [read more here](https://docs.nestjs.com/support).

## Stay in touch

- Author - [Kamil Myśliwiec](https://twitter.com/kammysliwiec)
- Website - [https://nestjs.com](https://nestjs.com/)
- Twitter - [@nestframework](https://twitter.com/nestframework)

## License

Nest is [MIT licensed](https://github.com/nestjs/nest/blob/master/LICENSE).

## How to Run

### 1. Database Setup

Update the `.env` file with your PostgreSQL credentials:

```env
DATABASE_TYPE=postgres
DATABASE_HOST=localhost
DATABASE_PORT=5432
DATABASE_USERNAME=your_username
DATABASE_PASSWORD=your_password
DATABASE_NAME=bidding
DATABASE_SYNCHRONIZE=true
```

### 2. Install Dependencies

```bash
npm install
cd frontend && npm install
```

### 3. Build and Start

```bash
# From the root directory
cd frontend && npm run build
cd ..
npm run build
npm run start
```

## Features

### Frontend Features

- **Dashboard**: View high-priority bids and today's bids
- **Date Filtering**:
  - Click the "3天" (3 days) or "7天" (7 days) buttons to filter bids from the last 3 or 7 days
  - The filter only limits the start date, showing all data from the selected start date onwards (including data newer than the end date)
- **Keyword Filtering**: Filter bids by keywords (saved in localStorage)
- **All Bids**: View all bids with pagination and source filtering
- **Keyword Management**: Add and delete keywords with weight-based priority

### Backend Features

- **Multi-Source Crawling**: Crawls bidding information from multiple sources:
  - ChdtpCrawler
  - ChngCrawler
  - SzecpCrawler
  - CdtCrawler
  - EpsCrawler
  - CnncecpCrawler
  - CgnpcCrawler
  - CeicCrawler
  - EspicCrawler
  - PowerbeijingCrawler
- **Automatic Retry**: If a crawler returns 0 items, it is retried after all crawlers complete
- **Proxy Support**: Configurable proxy settings via environment variables
- **Scheduled Tasks**: Automatic crawling at scheduled intervals

### Environment Variables

```env
# Database
DATABASE_TYPE=postgres
DATABASE_HOST=localhost
DATABASE_PORT=5432
DATABASE_USERNAME=your_username
DATABASE_PASSWORD=your_password
DATABASE_NAME=bidding
DATABASE_SYNCHRONIZE=true

# Proxy (optional)
PROXY_HOST=your_proxy_host
PROXY_PORT=your_proxy_port
PROXY_USERNAME=your_proxy_username
PROXY_PASSWORD=your_proxy_password
```

## Initial Setup

The system automatically initializes with the preset keywords "山东" (Shandong), "海" (sea), "建设" (construction), "工程" (engineering), and "采购" (procurement). You can manage these keywords and view crawled bidding information at http://localhost:3000.
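The date-filter behavior described under Frontend Features (the 3-day/7-day buttons constrain only the start date, with no upper bound) can be sketched as follows. This is an illustrative sketch; `filterStartDate` and `filterBids` are hypothetical names, not the actual frontend code:

```typescript
// Compute the cutoff date for the "last N days" filter.
// Only the start date is constrained; bids dated after "today"
// still pass, matching the behavior described above.
function filterStartDate(days: number, now: Date = new Date()): Date {
  const start = new Date(now);
  start.setDate(start.getDate() - days);
  start.setHours(0, 0, 0, 0); // include the whole starting day
  return start;
}

// Keep every bid published on or after the cutoff -- no end-date check.
function filterBids<T extends { publishDate: Date }>(bids: T[], days: number): T[] {
  const cutoff = filterStartDate(days);
  return bids.filter((b) => b.publishDate >= cutoff);
}
```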
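The automatic-retry rule from Backend Features (a crawler that returns 0 items is retried after the full pass) amounts to a simple two-phase loop. A minimal sketch, assuming a hypothetical `Crawler` interface whose `crawl()` resolves to the number of items fetched:

```typescript
// Hypothetical crawler shape: crawl() resolves to the item count.
type Crawler = { name: string; crawl: () => Promise<number> };

// Phase 1: run every crawler once. Phase 2: retry, once, any crawler
// that returned 0 items -- only after all crawlers have completed.
async function runWithRetry(crawlers: Crawler[]): Promise<Map<string, number>> {
  const results = new Map<string, number>();
  const emptyResults: Crawler[] = [];

  for (const c of crawlers) {
    const count = await c.crawl();
    results.set(c.name, count);
    if (count === 0) emptyResults.push(c); // queue for the retry pass
  }
  for (const c of emptyResults) {
    results.set(c.name, await c.crawl());
  }
  return results;
}
```

A transient block (rate limit, flaky proxy) often clears by the time the other sources finish, which is why the retry waits for the full pass instead of retrying immediately.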
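The optional `PROXY_*` variables listed under Environment Variables could be assembled into a request proxy option along these lines. This is a sketch under assumptions: `proxyFromEnv` and the `ProxyConfig` shape are illustrative (modeled on the axios-style `proxy` option), not the project's actual code:

```typescript
// Illustrative proxy shape, matching axios's `proxy` request option.
interface ProxyConfig {
  host: string;
  port: number;
  auth?: { username: string; password: string };
}

// Build a proxy config from the optional PROXY_* environment variables.
// Returns undefined when no proxy is configured, since the proxy is optional.
function proxyFromEnv(env: Record<string, string | undefined>): ProxyConfig | undefined {
  const { PROXY_HOST, PROXY_PORT, PROXY_USERNAME, PROXY_PASSWORD } = env;
  if (!PROXY_HOST || !PROXY_PORT) return undefined;
  const config: ProxyConfig = { host: PROXY_HOST, port: Number(PROXY_PORT) };
  if (PROXY_USERNAME && PROXY_PASSWORD) {
    config.auth = { username: PROXY_USERNAME, password: PROXY_PASSWORD };
  }
  return config;
}
```

In the running application the argument would be `process.env`; crawlers can then pass the result straight to their HTTP client when it is defined.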