Local Manga Translator by Camekan

Local Manga Translator allows you to read raw Manga (Japanese), Manhwa (Korean), and Manhua (Chinese) by capturing text from your browser and translating it using a powerful AI server running locally on your computer.

Some features may require payment

About this extension
🦊 Firefox Add-on Description & Instructions

⚠️ IMPORTANT: REQUIRES COMPANION SCRIPT
This add-on is a "connector" tool. To perform the translation, you must run the free companion Python script (manga_server.py) on your computer.
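
For orientation, here is a minimal sketch of what such a connector server looks like, assuming a Flask app listening on http://127.0.0.1:5000 (the address listed under the add-on's optional permissions). The /translate route name and the JSON fields are illustrative assumptions only; the real manga_server.py is more elaborate and may differ.

# Sketch of a local "connector" server, NOT the actual manga_server.py.
# Assumption: the add-on POSTs a base64-encoded screenshot crop and expects JSON back.
import base64
import io

from flask import Flask, jsonify, request
from PIL import Image

app = Flask(__name__)

@app.route("/translate", methods=["POST"])
def translate():
    payload = request.get_json(force=True)
    # Decode the image region sent by the browser extension.
    image = Image.open(io.BytesIO(base64.b64decode(payload["image"])))
    # The real server would run bubble detection, OCR and the local LLM here.
    translated = "(translated text would go here)"
    return jsonify({"translation": translated})

if __name__ == "__main__":
    # 127.0.0.1:5000 matches the address listed in the add-on's optional permissions.
    app.run(host="127.0.0.1", port=5000)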

✨ Key Features
  • Universal Hardware Support: Works on NVIDIA (CUDA), AMD/Intel (Vulkan), or CPU.
  • Specialized Japanese OCR: Uses Manga-OCR to read vertical, handwritten, and messy manga text perfectly.
  • Advanced Bubble Detection: Now uses Comic-Text-Detector (specialized for Manga/Manhwa) to accurately split connected bubbles and ignore background noise.
  • Smart Korean Mode: Uses PaddleOCR for high-accuracy recognition of Korean webtoons.
  • Natural AI Translation: Connects to local LLMs (like Qwen, Llama 3) for human-quality translation.
  • 100% Private & Free: No API keys, no monthly fees. Everything runs offline.
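
To make the pipeline above concrete, the sketch below chains two of the listed components for a Japanese page: Manga-OCR reads one cropped speech bubble and a local GGUF model loaded with llama-cpp-python translates the result. The model path, the bubble image name, and the prompt wording are placeholders; the actual manga_server.py may wire these steps together differently.

# Sketch only: one speech bubble through Manga-OCR and a local LLM.
from manga_ocr import MangaOcr
from llama_cpp import Llama

mocr = MangaOcr()  # downloads the OCR model on first use
llm = Llama(model_path=r"C:\MangaTranslator\Qwen2.5-14B-Instruct-Q4_K_M.gguf")  # placeholder path

japanese = mocr("bubble.png")  # OCR a cropped speech-bubble image (placeholder file name)
reply = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "Translate Japanese manga dialogue into natural English."},
        {"role": "user", "content": japanese},
    ]
)
print(reply["choices"][0]["message"]["content"])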



🛠️ Step-by-Step Installation Guide

Step 1: Install Visual Studio Build Tools (Windows Only)
  1. Download Visual Studio Build Tools 2022 from the Microsoft website.
  2. Run the installer.
  3. CRITICAL: Select the workload named "Desktop development with C++".
  4. Ensure the checklist on the right includes "Windows 10/11 SDK" and "MSVC... C++ x64/x86 build tools".
  5. Click Install and wait for it to finish.

Step 2: Install Python
  1. Download Python 3.10.11 from python.org.
  2. CRITICAL: During installation, check the box "Add Python to PATH".
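
To confirm that Python is installed and on PATH, open a new Command Prompt and run:

python --version
pip --version

Each command should print a version (Python 3.10.11 for the first). If it is "not recognized", re-run the installer and make sure "Add Python to PATH" is checked.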

Step 3: Download & Setup the Server Files

1. Get the Main Script
  • Download manga_server.py from https://github.com/Camekan/Manga_server.py/blob/main/manga_server.py and place it in a new folder (e.g., C:\MangaTranslator).

2. Get the Comic Detector (Critical Step)
  • Download this ZIP file: https://github.com/dmMaze/comic-text-detector/archive/refs/heads/master.zip
  • Extract the ZIP. You will see a folder named comic-text-detector-master.
  • RENAME that folder to: comic_text_detector
  • ⚠️ Important: Use underscores _, not dashes -.
  • MOVE this folder next to manga_server.py.

3. Get the Detector Model (Optional - Auto-downloads on first run)
  • The script will try to download this automatically. If it fails, do this:
  • Download the model file: https://github.com/zyddnys/manga-image-translator/releases/download/beta-0.2.1/comictextdetector.pt
  • Create a new folder named models inside your comic_text_detector folder if it isn't already there.
  • Place the .pt file there.
  • Correct Path: C:\MangaTranslator\comic_text_detector\models\comictextdetector.pt

Your folder must look exactly like this:

MangaTranslator/
├── manga_server.py
└── comic_text_detector/          <-- The folder you renamed
    ├── inference.py              <-- File inside
    ├── basemodel.py              <-- File inside
    └── models/                   <-- Folder inside
        └── comictextdetector.pt
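
As an optional sanity check, you can run this short Python snippet from inside the MangaTranslator folder; it only tests the paths shown in the tree above (the last one may report False until the model has been downloaded, since that step is optional):

# Optional layout check; run from inside the MangaTranslator folder.
import os
for path in ("manga_server.py",
             "comic_text_detector/inference.py",
             "comic_text_detector/basemodel.py",
             "comic_text_detector/models/comictextdetector.pt"):
    print(path, os.path.exists(path))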

Step 4: Install Dependencies

Open Command Prompt (cmd) inside the folder that contains manga_server.py (e.g., C:\MangaTranslator) and run these commands:

1. Install Basic Tools:

pip install flask manga-ocr pytesseract pillow opencv-python numpy requests

2. Install Korean OCR (PaddleOCR):

pip install paddlepaddle paddleocr protobuf==3.20.3

3. Install AI Engine (Choose Your Hardware):
  • Option A: NVIDIA Users (Best Performance)

pip install llama-cpp-python --extra-index-url https://abetlen.github.io/llama-cpp-python/whl/cu124
  • Option B: AMD / Intel Users (Vulkan Mode)
    • Download and install the Vulkan SDK.
    • Run this command:

set CMAKE_ARGS="-DGGML_VULKAN=on" && pip install llama-cpp-python --upgrade --force-reinstall --no-cache-dir
  • Option C: CPU Only (Compatible but Slower)

pip install llama-cpp-python
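
Whichever option you chose, a quick (optional) way to confirm that the packages import cleanly is:

python -c "import flask, manga_ocr, cv2, numpy, PIL, requests, paddleocr, llama_cpp; print('All imports OK')"

The paddleocr import can take a few seconds the first time; any ModuleNotFoundError points to the install command that still needs to be run.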

Step 5: Get an AI Model
  1. Download a .gguf model (Recommended: Qwen2.5-14B-Instruct-Q4_K_M.gguf) from HuggingFace.
  2. Open manga_server.py with a text editor (like Notepad) and set the MODEL_PATH to point to your downloaded file.
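
For example, if the recommended model is saved directly in C:\MangaTranslator, the edited line would look something like the one below (the exact formatting inside manga_server.py may differ slightly):

MODEL_PATH = r"C:\MangaTranslator\Qwen2.5-14B-Instruct-Q4_K_M.gguf"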

Step 6: Run & Read!
  1. Double-click manga_server.py to start the server.
  2. First run note: It will automatically download the detection model (~100MB). Wait for it to finish.
  3. Open a manga page in Firefox.
  4. Press Alt+Q (or your custom hotkey) and draw a box over the text!
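
If double-clicking only flashes a console window that closes immediately, start the server from a Command Prompt inside the folder instead, so any error messages stay visible:

python manga_server.py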

🔗 Download the Script Here: https://github.com/Camekan/Manga_server.py/blob/main/manga_server.py
Bug Reports: Please report issues on the GitHub Issues page.
Permissions and data

Required permissions:

  • Access your data for all websites

Optional permissions:

  • Access your data for 127.0.0.1:5000
  • Access your data for 127.0.0.1
Add-on links
  • Support site
Version
2.8
Size
16.04 KB
Last updated
20 days ago (Jan 21, 2026)
Related categories
  • Language support
  • Photos, music & videos
License
MIT License
Privacy policy
Read this add-on's privacy policy
Version history
  • View all versions