Local Manga Translator preview

Local Manga Translator by Camekan

Local Manga Translator allows you to read raw Manga (Japanese), Manhwa (Korean), and Manhua (Chinese) by capturing text from your browser and translating it using a powerful AI server running locally on your computer.

Some features may require payment


About this extension
🦊 Firefox Add-on Description & Instructions

⚠️ IMPORTANT: REQUIRES COMPANION SCRIPT
This add-on is a "connector" tool. To perform the translation, you must run the free companion Python script (manga_server.py) on your computer.
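If you want to confirm the companion server is actually running before you start translating, a quick check like the sketch below works. It assumes the script listens on 127.0.0.1:5000 (the address shown under this add-on's optional permissions); the actual routes are defined inside manga_server.py.

# Minimal sketch: confirm the local companion server is reachable.
# Assumes manga_server.py listens on 127.0.0.1:5000 (the address listed in
# this add-on's optional permissions); any HTTP status means it is up.
import requests

try:
    r = requests.get("http://127.0.0.1:5000/", timeout=3)
    print("Server reachable, HTTP status:", r.status_code)
except requests.exceptions.ConnectionError:
    print("Server not running - start manga_server.py first.")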

✨ Key Features
  • Universal Hardware Support: Works on NVIDIA (CUDA), AMD/Intel (Vulkan), or CPU.
  • Specialized Japanese OCR: Uses Manga-OCR to read vertical, handwritten, and messy manga text accurately (a minimal usage sketch follows this list).
  • Advanced Bubble Detection: Now uses Comic-Text-Detector (specialized for Manga/Manhwa) to accurately split connected bubbles and ignore background noise.
  • Smart Korean Mode: Uses PaddleOCR for high-accuracy recognition of Korean webtoons.
  • Natural AI Translation: Connects to local LLMs (like Qwen, Llama 3) for human-quality translation.
  • 100% Private & Free: No API keys, no monthly fees. Everything runs offline.
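The Japanese OCR feature above refers to the manga-ocr Python package that Step 4 installs; on its own it can be exercised with a couple of lines (the image filename below is just a placeholder):

# Minimal sketch of the manga-ocr package named above (installed in Step 4).
# "bubble.png" is a placeholder for any cropped speech-bubble image.
from manga_ocr import MangaOcr

mocr = MangaOcr()           # downloads the OCR model on first use
text = mocr("bubble.png")   # returns the recognized Japanese text as a string
print(text)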



🛠️ Step-by-Step Installation Guide

Step 1: Install Visual Studio Build Tools (Windows Only)
  1. Download Visual Studio Build Tools 2022 from the Microsoft website.
  2. Run the installer.
  3. CRITICAL: Select the workload named "Desktop development with C++".
  4. Ensure the checklist on the right includes "Windows 10/11 SDK" and "MSVC... C++ x64/x86 build tools".
  5. Click Install and wait for it to finish.

Step 2: Install Python
  1. Download Python 3.10.11 from python.org.
  2. CRITICAL: During installation, check the box "Add Python to PATH".

Step 3: Download & Setup the Server Files

1. Get the Main Script
  • Download manga_server.py from https://github.com/Camekan/Manga_server.py/blob/main/manga_server.py and place it in a new folder (e.g., C:\MangaTranslator).

2. Get the Comic Detector (Critical Step)
  • Download this ZIP file: https://github.com/dmMaze/comic-text-detector/archive/refs/heads/master.zip
  • Extract the ZIP. You will see a folder named comic-text-detector-master.
  • RENAME that folder to: comic_text_detector
  • ⚠️ Important: Use underscores _, not dashes -.
  • MOVE this folder next to manga_server.py.

3. Get the Detector Model (Optional - Auto-downloads on first run)
  • The script will try to download this automatically. If it fails, do this:
  • Download the model file: https://github.com/zyddnys/manga-image-translator/releases/download/beta-0.2.1/comictextdetector.pt
  • Create a new folder named models inside your comic_text_detector folder if it isn't already there.
  • Place the .pt file there.
  • Correct Path: C:\MangaTranslator\comic_text_detector\models\comictextdetector.pt

Your folder must look exactly like this:

MangaTranslator/
├── manga_server.py
└── comic_text_detector/          <-- The folder you renamed
    ├── inference.py              <-- File inside
    ├── basemodel.py              <-- File inside
    └── models/                   <-- Folder inside
        └── comictextdetector.pt
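Before starting the server you can double-check this layout with a short script; the sketch below assumes the C:\MangaTranslator location used in the examples above.

# Sketch: verify the folder layout shown above.
# Adjust BASE if you put the files somewhere other than C:\MangaTranslator.
# Note: comictextdetector.pt may legitimately be missing before the first run,
# since the script can download it automatically.
from pathlib import Path

BASE = Path(r"C:\MangaTranslator")
expected = [
    BASE / "manga_server.py",
    BASE / "comic_text_detector" / "inference.py",
    BASE / "comic_text_detector" / "basemodel.py",
    BASE / "comic_text_detector" / "models" / "comictextdetector.pt",
]
for path in expected:
    print(("OK      " if path.exists() else "MISSING ") + str(path))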

Step 4: Install Dependencies

Open Command Prompt (cmd) inside your MangaTranslator folder (the one containing manga_server.py) and run these commands; a quick import check to verify the installs follows the hardware options:

1. Install Basic Tools:

pip install flask manga-ocr pytesseract pillow opencv-python numpy requests

2. Install Korean OCR (PaddleOCR):

pip install paddlepaddle paddleocr protobuf==3.20.3

3. Install AI Engine (Choose Your Hardware):
  • Option A: NVIDIA Users (Best Performance)

pip install llama-cpp-python --extra-index-url https://abetlen.github.io/llama-cpp-python/whl/cu124
  • Option B: AMD / Intel Users (Vulkan Mode)
  • Download and install the Vulkan SDK.
  • Run this command:

set CMAKE_ARGS="-DGGML_VULKAN=on" && pip install llama-cpp-python --upgrade --force-reinstall --no-cache-dir
  • Option C: CPU Only (Compatible but Slower)

pip install llama-cpp-python
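The quick import check mentioned above can look like this; it only reports whether each module imports, and the module names differ from the pip package names in a few cases.

# Sanity check: confirm the packages installed above can be imported.
# pip name -> module name: pillow -> PIL, opencv-python -> cv2,
# llama-cpp-python -> llama_cpp, manga-ocr -> manga_ocr.
import importlib

modules = ["flask", "manga_ocr", "pytesseract", "PIL", "cv2",
           "numpy", "requests", "paddleocr", "llama_cpp"]
for name in modules:
    try:
        importlib.import_module(name)
        print("OK      ", name)
    except Exception as exc:
        print("PROBLEM ", name, "-", exc)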

Step 5: Get an AI Model
  1. Download a .gguf model (Recommended: Qwen2.5-14B-Instruct-Q4_K_M.gguf) from HuggingFace.
  2. Open manga_server.py with a text editor (like Notepad) and set the MODEL_PATH to point to your downloaded file (see the example below).
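The MODEL_PATH name comes from the script itself; inside manga_server.py the edited line will look roughly like this (the path is only an example, point it at wherever you saved your .gguf file):

# Example only: set MODEL_PATH in manga_server.py to your downloaded model file.
MODEL_PATH = r"C:\MangaTranslator\Qwen2.5-14B-Instruct-Q4_K_M.gguf"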

Step 6: Run & Read!
  1. Double-click manga_server.py to start the server.
  2. First run note: It will automatically download the detection model (~100MB). Wait for it to finish.
  3. Open a manga page in Firefox.
  4. Press Alt+Q (or your custom hotkey) and draw a box over the text!

🔗 Download the Script Here: https://github.com/Camekan/Manga_server.py/blob/main/manga_server.py
Bug Reports: Please report issues on the GitHub Issues page.
Permissions and data

Required permissions:

  • Access your data for all websites

Optional permissions:

  • Access your data for 127.0.0.1:5000
  • Access your data for 127.0.0.1
Add-on links
  • Support site
Version: 2.8
Size: 16.04 KB
Last updated: 18 days ago (Jan 21, 2026)
Related categories:
  • Language Support
  • Photos, Music & Videos
License: MIT License
Privacy policy: Read this add-on's privacy policy
Version history: See all versions