Local Manga Translator by Camekan
Local Manga Translator allows you to read raw Manga (Japanese), Manhwa (Korean), and Manhua (Chinese) by capturing text from your browser and translating it using a powerful AI server running locally on your computer.
Some features may require payment
3 users
Extension Metadata
Information
📦 Firefox Add-on Description & Instructions
⚠️ IMPORTANT: REQUIRES COMPANION SCRIPT
This add-on is a "connector" tool. To perform the translation, you must run the free companion Python script (manga_server.py) on your computer.
✨ Key Features
- Universal Hardware Support: Works on NVIDIA (CUDA), AMD/Intel (Vulkan), or CPU.
- Specialized Japanese OCR: Uses Manga-OCR to read vertical, handwritten, and stylized manga text with high accuracy.
- Advanced Bubble Detection: Now uses Comic-Text-Detector (specialized for Manga/Manhwa) to accurately split connected bubbles and ignore background noise.
- Smart Korean Mode: Uses PaddleOCR for high-accuracy recognition of Korean webtoons.
- Natural AI Translation: Connects to local LLMs (like Qwen, Llama 3) for human-quality translation.
- 100% Private & Free: No API keys, no monthly fees. Everything runs offline.
🛠️ Step-by-Step Installation Guide
Step 1: Install Visual Studio Build Tools (Windows Only)
- Download Visual Studio Build Tools 2022 from the Microsoft website.
- Run the installer.
- CRITICAL: Select the workload named "Desktop development with C++".
- Ensure the checklist on the right includes "Windows 10/11 SDK" and "MSVC... C++ x64/x86 build tools".
- Click Install and wait for it to finish.
Step 2: Install Python
- Download Python 3.10.11 from python.org.
- CRITICAL: During installation, check the box "Add Python to PATH".
Step 3: Download & Setup the Server Files
1. Get the Main Script
- Download manga_server.py from https://github.com/Camekan/Manga_server.py/blob/main/manga_server.py and place it in a new folder (e.g., C:\MangaTranslator).
2. Get the Comic Detector (Critical Step)
- Download this ZIP file: https://github.com/dmMaze/comic-text-detector/archive/refs/heads/master.zip
- Extract the ZIP. You will see a folder named comic-text-detector-master.
- RENAME that folder to: comic_text_detector
- ⚠️ Important: Use underscores (_), not dashes (-).
- MOVE this folder next to manga_server.py.
3. Get the Detector Model (Optional - Auto-downloads on first run)
- The script will try to download this automatically. If it fails, do this:
- Download the model file: https://github.com/zyddnys/manga-image-translator/releases/download/beta-0.2.1/comictextdetector.pt
- Create a new folder named models inside your comic_text_detector folder if it isn't already there.
- Place the .pt file there.
- Correct path: C:\MangaTranslator\comic_text_detector\models\comictextdetector.pt
Your folder must look exactly like this:
MangaTranslator/
├── manga_server.py
└── comic_text_detector/           <-- The folder you renamed
    ├── inference.py               <-- File inside
    ├── basemodel.py               <-- File inside
    └── models/                    <-- Folder inside
        └── comictextdetector.pt
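If you would rather not eyeball the tree, a short Python script can verify the layout. This is an optional convenience sketch, not part of the add-on; run it from inside your MangaTranslator folder.

```python
# Verify the Step 3 folder layout. The paths mirror the tree shown above.
import os

REQUIRED = [
    "manga_server.py",
    os.path.join("comic_text_detector", "inference.py"),
    os.path.join("comic_text_detector", "basemodel.py"),
    os.path.join("comic_text_detector", "models", "comictextdetector.pt"),
]

def missing_paths(root="."):
    """Return the required files that are not present under root."""
    return [p for p in REQUIRED if not os.path.exists(os.path.join(root, p))]

if __name__ == "__main__":
    gaps = missing_paths()
    print("Layout OK" if not gaps else "Missing: " + ", ".join(gaps))
```

If anything is listed as missing, re-check the rename in step 2 (underscores, not dashes).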
Step 4: Install Dependencies
Open Command Prompt (cmd) inside the folder you extracted and run these commands:
1. Install Basic Tools:
pip install flask manga-ocr pytesseract pillow opencv-python numpy requests
2. Install Korean OCR (PaddleOCR):
pip install paddlepaddle paddleocr protobuf==3.20.3
3. Install AI Engine (Choose Your Hardware):
- Option A: NVIDIA Users (Best Performance)
pip install llama-cpp-python --extra-index-url https://abetlen.github.io/llama-cpp-python/whl/cu124
- Option B: AMD / Intel Users (Vulkan Mode)
- Download and install the Vulkan SDK.
- Run this command:
set CMAKE_ARGS=-DGGML_VULKAN=on && pip install llama-cpp-python --upgrade --force-reinstall --no-cache-dir
- Option C: CPU Only (Compatible but Slower)
pip install llama-cpp-python
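Note that pip package names don't always match Python module names (manga-ocr imports as manga_ocr, pillow as PIL, opencv-python as cv2, llama-cpp-python as llama_cpp), so a quick import check catches a half-finished Step 4. This is an optional helper, not part of the project:

```python
# Sanity-check that the Step 4 dependencies are importable.
import importlib.util

# Import names for the pip packages installed above.
MODULES = ["flask", "manga_ocr", "pytesseract", "PIL", "cv2",
           "numpy", "requests", "paddleocr", "llama_cpp"]

def missing_modules(names):
    """Return the module names Python cannot find (without importing them)."""
    return [n for n in names if importlib.util.find_spec(n) is None]

if __name__ == "__main__":
    gaps = missing_modules(MODULES)
    print("All dependencies found" if not gaps
          else "Still missing: " + ", ".join(gaps))
```

Anything it reports as missing just needs the corresponding pip command from above re-run.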
Step 5: Get an AI Model
- Download a .gguf model (Recommended: Qwen2.5-14B-Instruct-Q4_K_M.gguf) from HuggingFace.
- Open manga_server.py with a text editor (like Notepad) and set the MODEL_PATH to point to your downloaded file.
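GGUF files begin with the 4-byte magic b"GGUF", so you can catch a truncated or mislabeled download before editing MODEL_PATH. This checker is an optional sketch, not part of manga_server.py; for demonstration it validates a throwaway fake file rather than a real model.

```python
# Check that a downloaded model file really is GGUF by reading its header.
import os
import tempfile

def looks_like_gguf(path):
    """True if the file begins with the GGUF magic bytes."""
    with open(path, "rb") as f:
        return f.read(4) == b"GGUF"

# Demo with a throwaway file; point this at your real .gguf path instead.
demo = os.path.join(tempfile.mkdtemp(), "fake.gguf")
with open(demo, "wb") as f:
    f.write(b"GGUF" + b"\x00" * 16)
print(looks_like_gguf(demo))  # True
```

If this returns False for your download, the file is probably incomplete; re-download it from HuggingFace.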
Step 6: Run & Read!
- Double-click manga_server.py to start the server.
- First run note: It will automatically download the detection model (~100MB). Wait for it to finish.
- Open a manga page in Firefox.
- Press Alt+Q (or your custom hotkey) and draw a box over the text!
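Under the hood, the add-on simply POSTs the captured region to the local server (the permissions list below shows it talks to 127.0.0.1:5000) and displays the translated text it gets back. The sketch below shows that loopback handshake against a stub server; the /translate endpoint and JSON field names are illustrative assumptions, not the script's documented API.

```python
# Loopback round trip: a stub stands in for manga_server.py, and a client
# POSTs a base64 image and reads back a JSON reply, like the add-on does.
import base64
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen

class StubHandler(BaseHTTPRequestHandler):
    """Stands in for the real server: accepts a JSON POST, returns JSON."""
    def do_POST(self):
        length = int(self.headers["Content-Length"])
        body = json.loads(self.rfile.read(length))
        # A real server would run OCR and LLM translation on the image here.
        decoded = base64.b64decode(body["image"])
        reply = json.dumps({"text": f"(stub) received {len(decoded)} bytes"})
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(reply.encode())

    def log_message(self, *args):
        pass  # keep the demo quiet

def roundtrip():
    # Port 0 picks any free port for the demo; the real script uses 5000.
    server = HTTPServer(("127.0.0.1", 0), StubHandler)
    port = server.server_address[1]
    threading.Thread(target=server.serve_forever, daemon=True).start()
    payload = json.dumps(
        {"image": base64.b64encode(b"fake image bytes").decode()}).encode()
    req = Request(f"http://127.0.0.1:{port}/translate", data=payload,
                  headers={"Content-Type": "application/json"})
    with urlopen(req) as resp:
        result = json.loads(resp.read())
    server.shutdown()
    server.server_close()
    return result

print(roundtrip())
```

This is also why the add-on requests access to 127.0.0.1: without the local server running, the Alt+Q capture has nothing to send the image to.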
🔗 Download the Script Here: https://github.com/Camekan/Manga_server.py/blob/main/manga_server.py
Bug Reports: Please report issues on the GitHub Issues page.
Rated 0 overall by 0 users
Permissions & Data
Required permissions:
- Access your data for all websites
Optional permissions:
- Access your data for 127.0.0.1:5000
- Access your data for 127.0.0.1
More Information
- Add-on links
- Version: 2.8
- Size: 16.04 KB
- Last updated: 2 months ago (January 21, 2026)
- Related categories
- License: MIT License
- Privacy policy: Read the privacy policy for this add-on
- Version history