# Duplicati Log Analysis Project

This project provides tools for analyzing Duplicati backup logs using AI models via the OpenAI API (or compatible endpoints). The main goal is to automate the detection of abnormal backup events and provide diagnostic feedback.

## Project Structure

- **ds_chat.py** - A standalone script that sends a Duplicati backup log to an AI model (via the OpenAI-compatible API) and prints whether the backup is "normal" or "abnormal". If abnormal, it also prints a diagnostic message. Useful for quick, manual log analysis.
- **duplicati_analysis_server_openai.py** - Implements a simple HTTP server that listens for POST requests containing Duplicati backup logs. When a log is received, it:
  1. Logs the incoming data to a file (`duplicati_logs.log`).
  2. Sends the log to the AI model for analysis.
  3. Returns a JSON response indicating if the backup is "normal" or "abnormal", along with any diagnostic message.

  This script is suitable for integration with automated systems or webhooks that need real-time log analysis.

## Usage

- **ds_chat.py**: Run directly to analyze a hardcoded log sample.

  ```bash
  python3 ds_chat.py
  ```

- **duplicati_analysis_server_openai.py**: Start the server to listen for log submissions (default port: 8680).

  ```bash
  python3 duplicati_analysis_server_openai.py
  ```

  Then, POST logs to `http://<server-host>:8680/` as form data.

## Requirements

- Python 3.7+
- `openai` Python package (for API calls)
- Network access to the OpenAI-compatible API endpoint

## Security

- API keys are currently hardcoded for demonstration. For production, use environment variables or secure secrets management.

---

**Note:** This project is designed for environments where Duplicati logs need to be programmatically analyzed for anomalies, leveraging AI for smarter diagnostics.

## Nextcloud Critical Data Backup Script

### Overview

The `backup_nc6.sh` script automates the backup of critical Nextcloud volumes using Duplicati inside Docker.
It is designed for reliability, flexibility, and easy integration with cron for scheduled backups.

**Dependency:** The script requires that the `nextcloud_duplicati` Docker container (which runs Duplicati) is up and running before the backup jobs are executed.

### How It Works

- **Stops all Nextcloud-related containers** to ensure data consistency.
- **Runs a series of Duplicati backup jobs** (defined in the script) for each critical volume or data directory.
- **Restarts all Nextcloud-related containers** after the backup completes.

### Usage

1. **Edit the script** if needed to match your container names, backup sources, or destinations.
2. **Make the script executable:**

   ```bash
   chmod +x /home/yazoo/appdev/duptest/backup_nc6.sh
   ```

3. **Run manually:**

   ```bash
   /home/yazoo/appdev/duptest/backup_nc6.sh
   ```

4. **Set up as a cron job (every other day at 2:00 AM):**

   ```
   0 2 */2 * * /home/yazoo/appdev/duptest/backup_nc6.sh >> /var/log/backup_nc6.log 2>&1
   ```

### Notes

* **Container Names:** The script uses the `NEXTCLOUD_CONTAINERS` array to define which containers to stop/start. Adjust this list if your Nextcloud stack uses different container names.
* **Backup Jobs:** The `JOBS` array defines each backup task, including source, destination, and Duplicati options. Add or remove entries as needed.
* **Duplicati Container:** The script assumes a dedicated Duplicati container (`nextcloud_duplicati`) with access to all relevant volumes.
* **Email Notifications:** Backup results are emailed using the SMTP settings defined in the script. Update these credentials for your environment.
* **Logging:** Redirect output to a log file for troubleshooting and audit purposes.
* **Permissions:** Ensure the script runs as a user with permission to control Docker and access all relevant volumes.
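The stop/backup/restart flow described above can be sketched as a minimal script. The container names, the `JOBS` entries, and the `duplicati-cli backup` invocation below are illustrative assumptions, not the contents of `backup_nc6.sh` itself; it also omits the email-notification step. It defaults to a dry run (printing the Docker commands instead of executing them), which is useful for checking the job list before enabling real execution.

```shell
#!/usr/bin/env bash
# Sketch of the stop -> backup -> restart sequence. Names and jobs below
# are placeholder assumptions; adapt them to your actual stack.
set -euo pipefail

NEXTCLOUD_CONTAINERS=(nextcloud_app nextcloud_db nextcloud_redis)  # assumed names
DUPLICATI_CONTAINER="nextcloud_duplicati"

# DRY_RUN=1 (the default here, for safety) prints each docker command
# instead of running it. Set DRY_RUN=0 for a real run.
DRY_RUN="${DRY_RUN:-1}"
run() { if [ "$DRY_RUN" = "1" ]; then echo "+ $*"; else "$@"; fi; }

# 1. Stop Nextcloud containers so the backed-up data is consistent.
for c in "${NEXTCLOUD_CONTAINERS[@]}"; do
  run docker stop "$c"
done

# 2. Run each backup job inside the Duplicati container.
#    "source=destination" pairs are placeholders for the real JOBS array.
JOBS=("/source/data=/backups/data" "/source/config=/backups/config")
for job in "${JOBS[@]}"; do
  src="${job%%=*}"
  dest="${job##*=}"
  run docker exec "$DUPLICATI_CONTAINER" duplicati-cli backup "file://$dest" "$src"
done

# 3. Restart the containers once all jobs have finished.
for c in "${NEXTCLOUD_CONTAINERS[@]}"; do
  run docker start "$c"
done
```

Wrapping every external command in `run` keeps the dry-run switch in one place, so the same loop bodies serve both rehearsal and real execution.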
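For integrating with the analysis server, a client can POST a log as form data and parse the JSON verdict, as described in the Usage section. This is a minimal stdlib-only sketch: the `log` form field name and the shape of the returned JSON are assumptions, since the server's exact request schema is defined in `duplicati_analysis_server_openai.py`.

```python
"""Sketch of a client for the Duplicati log analysis server.

Assumptions (verify against duplicati_analysis_server_openai.py):
  - the server accepts a form field named "log";
  - it responds with JSON describing the normal/abnormal verdict.
"""
import json
import urllib.parse
import urllib.request


def build_request(log_text: str, host: str = "localhost",
                  port: int = 8680) -> urllib.request.Request:
    """Build a POST request carrying the log as URL-encoded form data."""
    data = urllib.parse.urlencode({"log": log_text}).encode()
    return urllib.request.Request(f"http://{host}:{port}/",
                                  data=data, method="POST")


def submit_log(log_text: str, host: str = "localhost", port: int = 8680) -> dict:
    """Send the log to the analysis server and return the parsed JSON reply."""
    with urllib.request.urlopen(build_request(log_text, host, port),
                                timeout=30) as resp:
        return json.loads(resp.read().decode())
```

Separating `build_request` from `submit_log` keeps the network-free part (URL and payload construction) easy to inspect and test on its own.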