larry1chan@qq.com 8 months ago
Commit
b90c060229

+ 7 - 0
.gitignore

@@ -0,0 +1,7 @@
+config
+venv
+docker-duplicati
+backups/vol1
+backups/vol2
+backups/vol3
+

+ 3 - 0
.vscode/settings.json

@@ -0,0 +1,3 @@
+{
+    "editor.codeLensFontSize": 0
+}

+ 111 - 0
README.md

@@ -0,0 +1,111 @@
+# Duplicati Log Analysis Project
+
+This project provides tools for analyzing Duplicati backup logs using AI models via the OpenAI API (or compatible endpoints). The main goal is to automate the detection of abnormal backup events and provide diagnostic feedback.
+
+## Project Structure
+
+- **ds_chat.py**
+
+  - A standalone script that sends a Duplicati backup log to an AI model (via the OpenAI-compatible API) and prints whether the backup is "normal" or "abnormal". If abnormal, it also prints a diagnostic message. Useful for quick, manual log analysis.
+- **duplicati_analysis_server_openai.py**
+
+  - Implements a simple HTTP server that listens for POST requests containing Duplicati backup logs. When a log is received, it:
+    1. Logs the incoming data to a file (`duplicati_logs.log`).
+    2. Sends the log to the AI model for analysis.
+    3. Returns a JSON response indicating if the backup is "normal" or "abnormal", along with any diagnostic message.
+  - This script is suitable for integration with automated systems or webhooks that need real-time log analysis.
+
+## Usage
+
+- **ds_chat.py**:
+  Run directly to analyze a hardcoded log sample.
+
+  ```bash
+  python3 ds_chat.py
+  ```
+- **duplicati_analysis_server_openai.py**:
+  Start the server to listen for log submissions (default port: 8680).
+
+  ```bash
+  python3 duplicati_analysis_server_openai.py
+  ```
+
+  Then, POST logs to `http://<server>:8680/` as form data.
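Duplicati's `send-http-url` advanced option can be pointed at this endpoint; the report arrives as an `application/x-www-form-urlencoded` body, which the server decodes with `parse_qs`. A minimal round-trip sketch (the `Data` field name is an illustrative placeholder, not Duplicati's actual field name):

```python
from urllib.parse import parse_qs, urlencode

# Simulated POST body; the "Data" field name is a placeholder for illustration
body = urlencode({"Data": '{"MainOperation": "Backup", "ParsedResult": "Success"}'})

# The server decodes the body the same way: parse_qs maps each field to a list of values
data = parse_qs(body)
print(data["Data"][0])  # prints the JSON log payload
```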
+
+## Requirements
+
+- Python 3.7+
+- `openai` Python package (for API calls)
+- Network access to the OpenAI-compatible API endpoint
+
+## Security
+
+- API keys are currently hardcoded for demonstration. For production, use environment variables or secure secrets management.
+
+---
+
+**Note:**
+This project is designed for environments where Duplicati logs need to be programmatically analyzed for anomalies, leveraging AI for smarter diagnostics.
+
+
+## Nextcloud Critical Data Backup Script
+
+### Overview
+
+The `backup_nc6.sh` script automates the backup of critical Nextcloud volumes using Duplicati inside Docker. It is designed for reliability, flexibility, and easy integration with cron for scheduled backups.
+
+**Dependency:**
+
+The script requires that the `nextcloud_duplicati` Docker container (which runs Duplicati) is up and running before the backup jobs are executed.
+
+### How It Works
+
+- **Stops all Nextcloud-related containers** to ensure data consistency.
+- **Runs a series of Duplicati backup jobs** (defined in the script) for each critical volume or data directory.
+- **Restarts all Nextcloud-related containers** after the backup completes.
+
+### Usage
+
+1. **Edit the script** if needed to match your container names, backup sources, or destinations.
+2. **Make the script executable:**
+   ```bash
+   chmod +x /home/yazoo/appdev/duptest/backup_nc6.sh
+   ```
+
+
+3. **Run manually:**
+
+   ```bash
+   /home/yazoo/appdev/duptest/backup_nc6.sh
+   ```
+4. **Set up as a cron job (every other day at 2:00 AM):**
+
+   ```
+   0 2 */2 * * /home/yazoo/appdev/duptest/backup_nc6.sh >> /var/log/backup_nc6.log 2>&1
+   ```
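A long backup run could still be in flight when the next scheduled run fires; if that is a concern, the crontab entry can be wrapped in `flock` (util-linux) so overlapping runs are skipped rather than stacked. A sketch, assuming `flock` is installed at `/usr/bin/flock`:

```
0 2 */2 * * /usr/bin/flock -n /tmp/backup_nc6.lock /home/yazoo/appdev/duptest/backup_nc6.sh >> /var/log/backup_nc6.log 2>&1
```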
+
+### Notes
+
+* **Container Names:**
+  The script uses the `NEXTCLOUD_CONTAINERS` array to define which containers to stop/start. Adjust this list if your Nextcloud stack uses different container names.
+* **Backup Jobs:**
+  The `JOBS` array defines each backup task, including source, destination, and Duplicati options. Add or remove entries as needed.
+* **Duplicati Container:**
+  The script assumes a dedicated Duplicati container (`nextcloud_duplicati`) with access to all relevant volumes.
+* **Email Notifications:**
+  Backup results are emailed using the SMTP settings defined in the script. Update these credentials for your environment.
+* **Logging:**
+  Redirect output to a log file for troubleshooting and audit purposes.
+* **Permissions:**
+  Ensure the script runs as a user with permission to control Docker and access all relevant volumes.
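For reference, each `JOBS` entry is a single pipe-delimited string that the script splits into five fields (backup name, local database path, backup ID, source, destination). A quick check of that split, using an entry taken from the script itself:

```shell
# Split one JOBS entry the same way backup_nc6.sh does (IFS-scoped read on a here-string)
job="Nextcloud db|/config/BAWCJCSDVZ.sqlite|DB-2|/mnt/db/|file:///backups/nc/db/"
IFS="|" read -r BACKUP_NAME DBPATH BACKUP_ID SRC DST <<< "$job"
echo "$BACKUP_NAME: $SRC -> $DST"   # prints: Nextcloud db: /mnt/db/ -> file:///backups/nc/db/
```

The `IFS="|"` prefix applies only to the `read` command, so the global field separator is untouched.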
+

+ 67 - 0
backup_nc6.sh

@@ -0,0 +1,67 @@
+#!/bin/bash
+# filepath: /home/yazoo/appdev/duptest/backup_nc_critical.sh
+
+# --- Common Variables ---
+NEXTCLOUD_CONTAINERS=("nc6" "nc6_govod" "nc6_cron" "nc6_clamav" "nc6_db" "nc6_onlyoffice" "nc6_redis")
+DUPLICATI_CONTAINER="nextcloud_duplicati"
+DUPLICATI_CLI="/app/duplicati/duplicati-cli"
+MAIL_URL="smtp://smtp.qq.com:587?starttls=always"
+MAIL_FROM="larry1chan@qq.com"
+MAIL_USER="larry1chan@qq.com"
+MAIL_PASS="bnnxkyaajruteega"
+MAIL_TO="larry1chan@gmail.com"
+# NOTE: option values must not contain spaces, because $COMMON_OPTS is expanded unquoted below
+COMMON_OPTS="--send-mail-url=$MAIL_URL --send-mail-from=$MAIL_FROM --send-mail-any-operation=false --send-mail-password=$MAIL_PASS --send-mail-username=$MAIL_USER --send-mail-to=$MAIL_TO --encryption-module= --compression-module=zip --dblock-size=50mb --no-encryption=true --disable-module=console-password-input"
+
+# --- Backup Job Definitions ---
+JOBS=(
+  "Nextcloud db|/config/BAWCJCSDVZ.sqlite|DB-2|/mnt/db/|file:///backups/nc/db/"
+  "Nextcloud es_index|/config/AXUHBDDMXV.sqlite|DB-3|/mnt/es_index/|file:///backups/nc/es_index/"
+  "Nextcloud oo_data|/config/LPVNROTOBP.sqlite|DB-4|/mnt/oo_data/|file:///backups/nc/oo_data/"
+  "Nextcloud clamav|/config/UUBOPCEJTB.sqlite|DB-5|/mnt/clamav/|file:///backups/nc/clamav/"
+  "Nextcloud files|/config/KWDLIRMVDX.sqlite|DB-6|/mnt/files/|file:///backups/nc/files/"
+  "Nextcloud files - Jennifer|/config/LADZOGOMGJ.sqlite|DB-7|/mnt/jennifer/|file:///backups/nc/jennifer/"
+  "Nextcloud files - Larry|/config/VHNRBJWTOB.sqlite|DB-8|/mnt/larry/|file:///backups/nc/larry/"
+  "Nextcloud files - michelle|/config/NBLWZGCUWA.sqlite|DB-9|/mnt/michelle/|file:///backups/nc/michelle/"
+  "Nextcloud files - mindy|/config/CMLORKRFEV.sqlite|DB-1|/mnt/mindy/|file:///backups/nc/mindy/"
+)
+
+#  JOBS=(
+#   "Nextcloud db|/config/BAWCJCSDVZ.sqlite|DB-2|/mnt/db/|file:///backups/nc/db/"
+#  )
+
+# --- Stop Nextcloud Containers ---
+echo "Stopping Nextcloud containers..."
+for c in "${NEXTCLOUD_CONTAINERS[@]}"; do
+    if docker ps --filter "name=$c" --filter "status=running" | grep -q "$c"; then
+        echo "Stopping $c"
+        docker stop "$c"
+    else
+        echo "$c is not running or does not exist."
+    fi
+done
+
+# --- Backup Loop ---
+for job in "${JOBS[@]}"; do
+  IFS="|" read -r BACKUP_NAME DBPATH BACKUP_ID SRC DST <<< "$job"
+  echo "Starting backup: $BACKUP_NAME"
+  docker exec -i "$DUPLICATI_CONTAINER" $DUPLICATI_CLI backup "$DST" "$SRC" \
+    $COMMON_OPTS \
+    --backup-name="$BACKUP_NAME" \
+    --dbpath="$DBPATH" \
+    --backup-id="$BACKUP_ID"
+  echo "Finished backup: $BACKUP_NAME"
+done
+
+# --- Start Nextcloud Containers ---
+echo "Starting Nextcloud containers..."
+for c in "${NEXTCLOUD_CONTAINERS[@]}"; do
+    if docker ps -a --filter "name=$c" | grep -q "$c"; then
+        echo "Starting $c"
+        docker start "$c"
+    else
+        echo "$c does not exist."
+    fi
+done

+ 35 - 0
backups/nc_critical_data/critical nextcloud duplicati backup info.txt

@@ -0,0 +1,35 @@
+# Common arguments for Duplicati backup job
+# send-mail-url="smtp://smtp.qq.com:587?starttls=always"
+# send-mail-from="larry1chan@qq.com"
+# send-mail-password="bnnxkyaajruteega"
+# send-mail-username="larry1chan@qq.com"
+# send-mail-to="larry1chan@gmail.com"
+#
+#
+# docker command to run this script:
+# docker exec -it nextcloud_duplicati "/app/duplicati/duplicati-cli" backup ...
+
+# nextcloud db backup
+/app/duplicati/duplicati-cli backup file:///backups/nc/db/ /mnt/db/ --send-mail-url="smtp://smtp.qq.com:587?starttls=always" --send-mail-from="larry1chan@qq.com" --send-mail-any-operation=false --send-mail-password=bnnxkyaajruteega --send-mail-username="larry1chan@qq.com" --send-mail-to="larry1chan@gmail.com" --backup-name="Nextcloud db" --dbpath=/config/BAWCJCSDVZ.sqlite --backup-id=DB-2 --encryption-module= --compression-module=zip --dblock-size=50mb --no-encryption=true --disable-module=console-password-input
+
+# nextcloud es_index backup
+/app/duplicati/duplicati-cli backup file:///backups/nc/es_index/ /mnt/es_index/ --send-mail-url="smtp://smtp.qq.com:587?starttls=always" --send-mail-from="larry1chan@qq.com" --send-mail-any-operation=false --send-mail-password=bnnxkyaajruteega --send-mail-username="larry1chan@qq.com" --send-mail-to="larry1chan@gmail.com" --backup-name="Nextcloud es_index" --dbpath=/config/AXUHBDDMXV.sqlite --backup-id=DB-3 --encryption-module= --compression-module=zip --dblock-size=50mb --no-encryption=true --disable-module=console-password-input
+
+
+# nextcloud oo_data backup
+/app/duplicati/duplicati-cli backup file:///backups/nc/oo_data/ /mnt/oo_data/ --send-mail-url="smtp://smtp.qq.com:587?starttls=always" --send-mail-from="larry1chan@qq.com" --send-mail-any-operation=false --send-mail-password=bnnxkyaajruteega --send-mail-username="larry1chan@qq.com" --send-mail-to="larry1chan@gmail.com" --backup-name="Nextcloud oo_data" --dbpath=/config/LPVNROTOBP.sqlite --backup-id=DB-4 --encryption-module= --compression-module=zip --dblock-size=50mb --no-encryption=true --disable-module=console-password-input
+
+# nextcloud clamav backup
+/app/duplicati/duplicati-cli backup file:///backups/nc/clamav/ /mnt/clamav/ --send-mail-url="smtp://smtp.qq.com:587?starttls=always" --send-mail-from="larry1chan@qq.com" --send-mail-any-operation=false --send-mail-password=bnnxkyaajruteega --send-mail-username="larry1chan@qq.com" --send-mail-to="larry1chan@gmail.com" --backup-name="Nextcloud clamav" --dbpath=/config/UUBOPCEJTB.sqlite --backup-id=DB-5 --encryption-module= --compression-module=zip --dblock-size=50mb --no-encryption=true --disable-module=console-password-input
+
+# nextcloud files backup
+/app/duplicati/duplicati-cli backup file:///backups/nc/files/ /mnt/files/ --send-mail-url="smtp://smtp.qq.com:587?starttls=always" --send-mail-from="larry1chan@qq.com" --send-mail-any-operation=false --send-mail-password=bnnxkyaajruteega --send-mail-username="larry1chan@qq.com" --send-mail-to="larry1chan@gmail.com" --backup-name="Nextcloud files" --dbpath=/config/KWDLIRMVDX.sqlite --backup-id=DB-6 --encryption-module= --compression-module=zip --dblock-size=50mb --no-encryption=true --disable-module=console-password-input
+
+# nextcloud data jennifer backup
+/app/duplicati/duplicati-cli backup file:///backups/nc/jennifer/ /mnt/jennifer/ --send-mail-url="smtp://smtp.qq.com:587?starttls=always" --send-mail-from="larry1chan@qq.com" --send-mail-any-operation=false --send-mail-password=bnnxkyaajruteega --send-mail-username="larry1chan@qq.com" --send-mail-to="larry1chan@gmail.com" --backup-name="Nextcloud files - Jennifer" --dbpath=/config/LADZOGOMGJ.sqlite --backup-id=DB-7 --encryption-module= --compression-module=zip --dblock-size=50mb --no-encryption=true --disable-module=console-password-input
+
+# nextcloud data larry backup
+/app/duplicati/duplicati-cli backup file:///backups/nc/larry/ /mnt/larry/ --send-mail-url="smtp://smtp.qq.com:587?starttls=always" --send-mail-from="larry1chan@qq.com" --send-mail-any-operation=false --send-mail-password=bnnxkyaajruteega --send-mail-username="larry1chan@qq.com" --send-mail-to="larry1chan@gmail.com" --backup-name="Nextcloud files - Larry" --dbpath=/config/VHNRBJWTOB.sqlite --backup-id=DB-8 --encryption-module= --compression-module=zip --dblock-size=50mb --no-encryption=true --disable-module=console-password-input
+
+# nextcloud data michelle backup
+/app/duplicati/duplicati-cli backup file:///backups/nc/michelle/ /mnt/michelle/ --send-mail-url="smtp://smtp.qq.com:587?starttls=always" --send-mail-from="larry1chan@qq.com" --send-mail-any-operation=false --send-mail-password=bnnxkyaajruteega --send-mail-username="larry1chan@qq.com" --send-mail-to="larry1chan@gmail.com" --backup-name="Nextcloud files - michelle" --dbpath=/config/NBLWZGCUWA.sqlite --backup-id=DB-9 --encryption-module= --compression-module=zip --dblock-size=50mb --no-encryption=true --disable-module=console-password-input

+ 34 - 0
docker-compose.yml

@@ -0,0 +1,34 @@
+---
+volumes:
+  dutest_vol1:
+    name: dutest_vol1
+    driver: local
+  dutest_vol2:
+    name: dutest_vol2
+    driver: local
+  dutest_vol3:
+    name: dutest_vol3
+    driver: local
+
+services:
+
+
+  duplicati:
+    image: lscr.io/linuxserver/duplicati:latest
+    container_name: dutest
+    environment:
+      - PUID=0
+      - PGID=0
+      - TZ=Asia/Hong_Kong
+      - SETTINGS_ENCRYPTION_KEY=12345678
+      - CLI_ARGS= #optional
+      - DUPLICATI__WEBSERVICE_PASSWORD=admin #optional
+    volumes:
+      - ./config:/config
+      - ./backups:/backups
+      - dutest_vol1:/mnt/vol1
+      - dutest_vol2:/mnt/vol2
+      - dutest_vol3:/mnt/vol3
+    ports:
+      - 8220:8200
+    restart: "no"  # quoted so YAML does not parse it as boolean false

+ 5 - 0
docker_startstop_containers.sh

@@ -0,0 +1,5 @@
+#!/bin/bash
+docker run --rm -it \
+    -v /var/run/docker.sock:/var/run/docker.sock \
+    -v ./startstop_containers.sh:/startstop_containers.sh \
+    ubuntu:latest \
+    bash -c "apt-get update && apt-get install -y docker.io && bash /startstop_containers.sh stop"

File diff suppressed because it is too large
+ 6 - 0
ds_chat.py


+ 110 - 0
duplicati_analysis_server.py

@@ -0,0 +1,110 @@
+from http.server import BaseHTTPRequestHandler, HTTPServer
+from urllib.parse import parse_qs
+import json
+import logging
+from datetime import datetime
+import requests  # For making HTTP requests to DeepSeek API
+
+# Configure logging
+logging.basicConfig(
+    filename="duplicati_logs.log",
+    level=logging.INFO,
+    format="%(asctime)s - %(message)s",
+)
+
+# DeepSeek API configuration
+DEEPSEEK_API_URL = "https://api.deepseek.com/v1/chat/completions"  # DeepSeek Chat API endpoint
+DEEPSEEK_API_KEY = "sk-175159668af9430ea6208f5636b24198"  # Replace with your DeepSeek API key
+
+class DuplicatiLogHandler(BaseHTTPRequestHandler):
+    def do_POST(self):
+        # Get the length of the data
+        content_length = int(self.headers["Content-Length"])
+        # Read the POST data
+        post_data = self.rfile.read(content_length).decode("utf-8")
+        # Parse the form-urlencoded data
+        data = parse_qs(post_data)
+
+        # Log the data
+        self.log_data(data)
+
+        # Send the log message to DeepSeek Chat API for analysis
+        analysis_result, diagnostic_message = self.send_to_deepseek_chat(data)
+        if analysis_result:
+            logging.info(f"DeepSeek Analysis Result: {analysis_result}")
+            if diagnostic_message:
+                logging.info(f"Diagnostic Message: {diagnostic_message}")
+
+        # Send a response
+        self.send_response(200)
+        self.send_header("Content-type", "application/json")
+        self.end_headers()
+        response = {
+            "status": "success",
+            "message": "Log received",
+            "analysis_result": analysis_result,
+            "diagnostic_message": diagnostic_message,
+        }
+        self.wfile.write(json.dumps(response).encode("utf-8"))
+
+    def log_data(self, data):
+        """Log the received data to a file."""
+        log_entry = {
+            "timestamp": datetime.now().isoformat(),
+            "data": data,
+        }
+        logging.info(json.dumps(log_entry))
+
+    def send_to_deepseek_chat(self, data):
+        """Send the log message to DeepSeek Chat API for analysis."""
+        try:
+            # Prepare the payload for DeepSeek Chat API
+            payload = {
+                "model": "deepseek-chat",  # Specify the model to use
+                "messages": [
+                    {
+                        "role": "system",
+                        "content": "You are an operator analyzing backup logs. "
+                                  "Check the log message for abnormalities and respond with 'normal' or 'abnormal'. "
+                                  "If the backup is abnormal, provide a detailed diagnostic message explaining the issue and suggesting possible fixes.",
+                    },
+                    {
+                        "role": "user",
+                        "content": json.dumps(data),  # Pass the raw data as a JSON string
+                    },
+                ],
+            }
+
+            # Make a POST request to DeepSeek Chat API
+            headers = {
+                "Authorization": f"Bearer {DEEPSEEK_API_KEY}",
+                "Content-Type": "application/json",
+            }
+            response = requests.post(DEEPSEEK_API_URL, headers=headers, json=payload)
+            response.raise_for_status()  # Raise an error for bad status codes
+
+            # Parse the response
+            result = response.json()
+            chat_response = result["choices"][0]["message"]["content"]
+
+            # Extract analysis result and diagnostic message
+            if "abnormal" in chat_response.lower():
+                analysis_result = "abnormal"
+                diagnostic_message = chat_response  # Use the full response as the diagnostic message
+            else:
+                analysis_result = "normal"
+                diagnostic_message = None
+
+            return analysis_result, diagnostic_message
+        except requests.exceptions.RequestException as e:
+            logging.error(f"Failed to send data to DeepSeek Chat API: {e}")
+            return None, None
+
+def run(server_class=HTTPServer, handler_class=DuplicatiLogHandler, port=8680):
+    server_address = ("", port)
+    httpd = server_class(server_address, handler_class)
+    print(f"Starting Duplicati log server on port {port}...")
+    httpd.serve_forever()
+
+if __name__ == "__main__":
+    run()
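Both server variants decide "normal" vs "abnormal" by scanning the model's reply for the literal word "abnormal". That check can be pulled out and exercised on its own; note the caveat that a reply such as "no abnormalities found" also trips the keyword:

```python
def classify(chat_response: str):
    """Keyword check mirroring the servers' logic on the model reply."""
    if "abnormal" in chat_response.lower():
        # The full reply doubles as the diagnostic message
        return "abnormal", chat_response
    return "normal", None

print(classify("Backup finished: 1523 files examined, result Success"))  # ('normal', None)
```

A stricter contract, such as asking the model to begin its reply with exactly one word, would avoid that false positive.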

+ 109 - 0
duplicati_analysis_server_openai.py

@@ -0,0 +1,109 @@
+from http.server import BaseHTTPRequestHandler, HTTPServer
+from urllib.parse import parse_qs
+import json
+import logging
+from datetime import datetime
+from openai import OpenAI # OpenAI Python client
+
+# Configure logging
+logging.basicConfig(
+    filename="duplicati_logs.log",
+    level=logging.INFO,
+    format="%(asctime)s - %(message)s",
+)
+
+# OpenAI API configuration
+API_KEY = "sk-175159668af9430ea6208f5636b24198"  # Replace with your OpenAI API key
+MODEL = "deepseek-chat"
+
+class DuplicatiLogHandler(BaseHTTPRequestHandler):
+    def do_POST(self):
+        # Get the length of the data
+        content_length = int(self.headers["Content-Length"])
+        # Read the POST data
+        post_data = self.rfile.read(content_length).decode("utf-8")
+        # Parse the form-urlencoded data
+        data = parse_qs(post_data)
+
+        # Log the data
+        self.log_data(data)
+
+        # Send the log message to OpenAI API for analysis
+        analysis_result, diagnostic_message = self.send_to_openai(data)
+        if analysis_result:
+            logging.info(f"OpenAI Analysis Result: {analysis_result}")
+            if diagnostic_message:
+                logging.info(f"Diagnostic Message: {diagnostic_message}")
+
+        # Send a response
+        self.send_response(200)
+        self.send_header("Content-type", "application/json")
+        self.end_headers()
+        response = {
+            "status": "success",
+            "message": "Log received",
+            "analysis_result": analysis_result,
+            "diagnostic_message": diagnostic_message,
+        }
+        self.wfile.write(json.dumps(response).encode("utf-8"))
+
+    def log_data(self, data):
+        """Log the received data to a file."""
+        log_entry = {
+            "timestamp": datetime.now().isoformat(),
+            "data": data,
+        }
+        logging.info(json.dumps(log_entry))
+
+    def send_to_openai(self, data):
+        """Send the raw log data to OpenAI API for analysis."""
+        try:
+            # Prepare the messages for OpenAI API
+            messages = [
+                {
+                    "role": "system",
+                    "content": "You are an operator analyzing backup logs. "
+                              "Check the log message for abnormalities and respond with 'normal' or 'abnormal'. "
+                              "If the backup is abnormal, provide a detailed diagnostic message explaining the issue and suggesting possible fixes.",
+                },
+                {
+                    "role": "user",
+                    "content": json.dumps(data),  # Pass the raw data as a JSON string
+                },
+            ]
+
+            # Make a request to OpenAI API
+            client = OpenAI(api_key=API_KEY, base_url="https://api.deepseek.com")
+            response = client.chat.completions.create(
+                model=MODEL,
+                messages=messages,
+                max_tokens=1024,
+                temperature=0.7,
+                stream=False,
+            )
+
+            # Parse the response
+            chat_response = response.choices[0].message.content
+
+            # Extract analysis result and diagnostic message
+            if "abnormal" in chat_response.lower():
+                analysis_result = "abnormal"
+                diagnostic_message = chat_response  # Use the full response as the diagnostic message
+            else:
+                analysis_result = "normal"
+                diagnostic_message = chat_response 
+
+            return analysis_result, diagnostic_message
+        except Exception as e:
+            logging.error(f"Failed to send data to OpenAI API: {e}")
+            return None, None
+
+def run(server_class=HTTPServer, handler_class=DuplicatiLogHandler, port=8680):
+    server_address = ("", port)
+    httpd = server_class(server_address, handler_class)
+    print(f"Starting Duplicati log server on port {port}...")
+    httpd.serve_forever()
+
+if __name__ == "__main__":
+    run()

+ 48 - 0
duplicati_log_server.py

@@ -0,0 +1,48 @@
+from http.server import BaseHTTPRequestHandler, HTTPServer
+from urllib.parse import parse_qs
+import json
+import logging
+from datetime import datetime
+
+# Configure logging
+logging.basicConfig(
+    filename="duplicati_logs.log",
+    level=logging.INFO,
+    format="%(asctime)s - %(message)s",
+)
+
+class DuplicatiLogHandler(BaseHTTPRequestHandler):
+    def do_POST(self):
+        # Get the length of the data
+        content_length = int(self.headers["Content-Length"])
+        # Read the POST data
+        post_data = self.rfile.read(content_length).decode("utf-8")
+        # Parse the form-urlencoded data
+        data = parse_qs(post_data)
+
+        # Log the data
+        self.log_data(data)
+
+        # Send a response
+        self.send_response(200)
+        self.send_header("Content-type", "application/json")
+        self.end_headers()
+        response = {"status": "success", "message": "Log received"}
+        self.wfile.write(json.dumps(response).encode("utf-8"))
+
+    def log_data(self, data):
+        """Log the received data to a file."""
+        log_entry = {
+            "timestamp": datetime.now().isoformat(),
+            "data": data,
+        }
+        logging.info(json.dumps(log_entry))
+
+def run(server_class=HTTPServer, handler_class=DuplicatiLogHandler, port=8680):
+    server_address = ("", port)
+    httpd = server_class(server_address, handler_class)
+    print(f"Starting Duplicati log server on port {port}...")
+    httpd.serve_forever()
+
+if __name__ == "__main__":
+    run()

File diff suppressed because it is too large
+ 0 - 0
duplicati_logs.log


+ 1 - 0
remove_containers_and_volumes_strict.log

@@ -0,0 +1 @@
+2025-03-10 05:44:30 - ERROR: Failed to stop container: compact-box

+ 44 - 0
rm_vol.sh

@@ -0,0 +1,44 @@
+#!/bin/bash
+
+# Arrays containing container names and their associated volumes
+CONTAINER_NAMES=("compact-box")  # Replace with your container names
+VOLUMES=("dutest_vol1" "dutest_vol2" "dutest_vol3")                   # Replace with your volume names
+
+# Log file for errors
+LOG_FILE="remove_containers_and_volumes_strict.log"
+> "$LOG_FILE"  # Clear the log file
+
+# Function to log errors and exit
+log_error_and_exit() {
+  echo "$(date '+%Y-%m-%d %H:%M:%S') - ERROR: $1" >> "$LOG_FILE"
+  echo "Script failed. Check $LOG_FILE for details." >&2
+  exit 1
+}
+
+# Stop and remove each container
+for CONTAINER_NAME in "${CONTAINER_NAMES[@]}"; do
+  echo "Processing container: $CONTAINER_NAME"
+
+  # Stop the container
+  if ! docker stop "$CONTAINER_NAME" > /dev/null 2>&1; then
+    log_error_and_exit "Failed to stop container: $CONTAINER_NAME"
+  fi
+  echo "Stopped container: $CONTAINER_NAME"
+
+  # Remove the container
+  if ! docker rm "$CONTAINER_NAME" > /dev/null 2>&1; then
+    log_error_and_exit "Failed to remove container: $CONTAINER_NAME"
+  fi
+  echo "Removed container: $CONTAINER_NAME"
+done
+
+# Remove the volumes in a separate loop (the container and volume lists may differ in length)
+for VOLUME in "${VOLUMES[@]}"; do
+  if ! docker volume rm "$VOLUME" > /dev/null 2>&1; then
+    log_error_and_exit "Failed to remove volume: $VOLUME"
+  fi
+  echo "Removed volume: $VOLUME"
+done
+
+echo "Script execution completed successfully."

+ 1 - 0
run.sh

@@ -0,0 +1 @@
+docker run -it --rm --name compact -v dutest_vol1:/mnt/vol1 -v dutest_vol2:/mnt/vol2 -v dutest_vol3:/mnt/vol3  compact-box

+ 40 - 0
startstop_containers.sh

@@ -0,0 +1,40 @@
+#!/bin/bash
+
+# Array of container names or IDs
+containers=("nc6" "nc6_govod" "nc6_cron" "nc6_clamav" "nc6_db" "nc6_onlyoffice" "nc6_redis")
+
+# Function to stop containers
+stop_containers() {
+    for container in "${containers[@]}"; do
+        if docker ps --filter "name=$container" --filter "status=running" | grep -q "$container"; then
+            echo "Stopping container: $container"
+            docker stop "$container"
+        else
+            echo "Container $container is not running or does not exist."
+        fi
+    done
+}
+
+# Function to start containers
+start_containers() {
+    for container in "${containers[@]}"; do
+        if docker ps --filter "name=$container" --filter "status=running" | grep -q "$container"; then
+            echo "Container $container is already running."
+        else
+            echo "Starting container: $container"
+            docker start "$container"
+        fi
+    done
+}
+
+# Main routine
+if [[ "$1" == "stop" ]]; then
+    echo "Shutting down containers..."
+    stop_containers
+elif [[ "$1" == "start" ]]; then
+    echo "Bringing up containers..."
+    start_containers
+else
+    echo "Usage: $0 {start|stop}"
+    exit 1
+fi

Some files were not shown because too many files changed in this diff