Reviews of Neat URL
Neat URL by Geoffrey De Belie
22 reviews
- An excellent extension; I'm withholding one star as encouragement. Users in mainland China can manually add the following rules to block link tracking by Baidu and similar services:
by@quark.cn, by2@quark.cn, from@quark.cn, uc_param_str@quark.cn, ch@baidu.com, eqid@baidu.com, f@baidu.com, lid@baidu, inputT@baidu.com, oq@baidu.com, prefixsug@baidu.com, rqid@baidu.com, rsf@baidu.com, rsp@baidu.com, rsv_bp@baidu.com, rsv_btype@baidu.com, rsv_dl@baidu.com, rsv_enter@baidu.com, rsv_idx@baidu.com, rsv_pq@baidu.com, rsv_sug1@baidu.com, rsv_sug2@baidu.com, rsv_sug3@baidu.com, rsv_sug4@baidu.com, rsv_sug5@baidu.com, rsv_sug6@baidu.com, rsv_sug7@baidu.com, rsv_sug8@baidu.com, rsv_sug9@baidu.com, rsv_t@baidu.com, sa@baidu.com, tn@baidu.com, usm@baidu.com, bd_page_type@m.baidu.com, cst@m.baidu.com, ct@m.baidu.com, di@m.baidu.com, dict@m.baidu.com, eqid@m.baidu.com, fr@m.baidu.com, from@m.baidu.com, is_baidu@m.baidu.com, isAtom@m.baidu.com, lid@m.baidu.com, ms@m.baidu.com, pu@m.baidu.com, ref@m.baidu.com, sa@m.baidu.com, sec@m.baidu.com, sectab@m.baidu.com, ssid@m.baidu.com, tcplug@m.baidu.com, tj@m.baidu.com, tn@m.baidu.com, uid@m.baidu.com, vslid@m.baidu.com, w_qd@m.baidu.com, list_entrance@ixigua.com, logTag@ixigua.com, preActiveKey@ixigua.com, _signature@ixigua.com, search_source@search.bilibili.com, spm_id_from@space.bilibili.com, spm_id_from@bilibili.com, vd_source@bilibili.com, ct@image.baidu.com, fr@tieba.baidu.com, fr@zhidao.baidu.com, tn@zhidao.baidu.com, pca@ireader.com.cn
- Rated 4 / 5 by Firefox user 5612562, 4 years ago
  Moved away from CleanLinks because it would break too many sites with the embedded-URL feature. I do wish Neat URL had regex support and a UI to easily add more blocks and to troubleshoot/see what has been blocked, but it works great either way. I'm attaching my custom blocking rules, which include rules I exported from CleanLinks: https://pastebin.com/Hgae9Ans
- Rated 4 / 5 by peacefulpotato, 4 years ago
  I used ClearURLs for a while until I learned that it made constant requests to Cloudflare. This one is almost perfect, but it doesn't seem to work on AMO. Probably a browser restriction, though.
- Rated 4 / 5 by Firefox user 16009164, 5 years ago
  Interesting add-on. It seems to work fine but sometimes fails, for example with a URL like this:
  `https://www.itaka.pl/wczasy/egipt/hurghada/hotel-titanic-beach-spa-aqua-park,HRGTITA.html?ofr_id=2193798335ea5d9ebe13b26208d0a77c9a4e6c265d2049538db5f5d51892baf8&adults=2&childs=0#utm_source=facebook&utm_medium=social&utm_campaign=fb_post&utm_content=Egipt_Hotel_Titanic_Beach_Spa_23_10_2019&affiliate=32232665`
  P.S. Cleaning the hash (#) part doesn't work either.
- Rated 4 / 5 by Firefox user 14619974, 6 years ago
  Nice add-on, but it doesn't seem to work with the "spm" and "epid" parameters.
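The two complaints above (tracking parameters surviving in the hash fragment, and `spm`/`epid` not being stripped) both come down to removing a blocklist of keys from the query string and the fragment. A minimal Python sketch of that idea, with a hypothetical `TRACKING` blocklist — this is an illustration of the technique, not Neat URL's actual implementation:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical blocklist for illustration; Neat URL configures this via rules.
TRACKING = {"utm_source", "utm_medium", "utm_campaign", "utm_content",
            "utm_term", "affiliate", "spm", "epid"}

def clean(url: str) -> str:
    """Remove blocklisted parameters from both the query and the fragment."""
    parts = urlsplit(url)

    def strip(qs: str) -> str:
        # parse_qsl also handles fragments that are written in query-string
        # form, as in the itaka.pl example above (utm_* pairs after '#').
        kept = [(k, v) for k, v in parse_qsl(qs, keep_blank_values=True)
                if k not in TRACKING]
        return urlencode(kept)

    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       strip(parts.query), strip(parts.fragment)))
```

The fragment is deliberately passed through the same stripper as the query, since some sites smuggle `utm_*` pairs after the `#` where a query-only cleaner never looks.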
- Rated 4 / 5 by Aurora of Earth, 6 years ago
- Rated 4 / 5 by Firefox user 14433072, 6 years ago
- Rated 4 / 5 by Firefox user 14209964, 6 years ago
- Rated 4 / 5 by Firefox user 14329427, 6 years ago
- Rated 4 / 5 by Firefox user 14172470, 7 years ago
- Rated 4 / 5 by The Beard Below My Chin, 7 years ago
  At a glance, the rules are not easy on the eyes and look inconvenient to maintain. This could be improved if they were separated by newlines and there were a switch for case sensitivity. With uMatrix as inspiration, I propose the following format, and I fully understand if this is out of the scope of the project.
retain domain (case sensitive | insensitive) (comma separated params/spaces ignored from here){newline} //keep matching params, remove everything else
delete domain (case sensitive | insensitive) (comma separated params/spaces ignored from here){newline} //remove matching params, keep everything else
cutoff domain (case sensitive | insensitive) (relax = $/|force = $$/) (single param){newline} //remove all parameters after matching single param
ignore domain{newline} //no operation, parameters untouched/whitelist
Local rules and ignore override global rules.
Example:
delete * sensitive badparam, evilparam, etc //globally remove, case sensitive
retain * insensitive okparam, goodparam, etc //globally keep, case insensitive
delete website.* sensitive evilref, badstuff, etc //locally remove, case sensitive
retain website.* sensitive evilref, badstuff, etc //locally keep, case sensitive
delete website.* insensitive okparam, goodparam, etc //locally remove, bypass global rules, case insensitive
retain somesite.com insensitive badparam, evilparam, etc //locally keep, bypass global rules, case insensitive
cutoff somesite.com sensitive force okparam //trim all parameters after 'okparam', relax = $/, force = $$/
ignore anothersite.com //whitelist
- Rated 4 / 5 by Firefox user 13570576, 7 years ago
  It just does its job right; kudos to the developer. But I think the toolbar icon is overkill for this kind of extension.
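For what it's worth, the uMatrix-style format proposed in the review above could be parsed along these lines. This is a sketch of the reviewer's hypothetical proposal only — the `Rule` class and `parse_rules` helper are invented for illustration, and Neat URL itself uses a different `param@domain` syntax:

```python
from dataclasses import dataclass, field

@dataclass
class Rule:
    action: str                      # retain | delete | cutoff | ignore
    domain: str                      # e.g. "*", "website.*", "somesite.com"
    case_sensitive: bool = True
    params: list = field(default_factory=list)
    mode: str = ""                   # for cutoff: "relax" or "force"

def parse_rules(text: str) -> list:
    """Parse the proposed newline-separated rule format into Rule records."""
    rules = []
    for line in text.splitlines():
        line = line.split("//")[0].strip()    # drop trailing //comments
        if not line:
            continue
        tokens = line.split(None, 3)          # action, domain, sensitivity, rest
        action, domain = tokens[0], tokens[1]
        if action == "ignore":                # whitelist: no further fields
            rules.append(Rule(action, domain))
            continue
        sensitive = tokens[2] == "sensitive"
        rest = tokens[3] if len(tokens) > 3 else ""
        if action == "cutoff":                # single mode word + single param
            mode, param = rest.split(None, 1)
            rules.append(Rule(action, domain, sensitive, [param.strip()], mode))
        else:                                 # retain/delete: comma-separated params
            params = [p.strip() for p in rest.split(",") if p.strip()]
            rules.append(Rule(action, domain, sensitive, params))
    return rules
```

A design consequence the reviewer hints at: because each rule is one line with a fixed token order, spaces inside the parameter list can simply be ignored, which keeps long blocklists readable.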
- Rated 4 / 5 by Firefox user 12765006, 7 years ago
  Neat URL cleans URLs up well.
Good replacement for Pure URL.
It does not clean up URLs in a webpage when loading that webpage. The Pure URL addon or Clean Links addon did that.
I would like Neat URL to fix URLs within webpages when loading them. Please add this feature.
The developers seem to be making Neat URL into a replacement for Pure URL. The mentions of feature parity in the changelogs are greatly appreciated. From what version onwards is Neat URL at full feature parity?
I find myself searching for a good URL clean-up addon.
Neat URL is good but lacks dealing with redirects (including nested ones) and retrieving long URLs from shortened URLs.
I would like to consolidate URL clean-up in one addon. I would like to see URL cleaners take a more feature-rich direction.
Now I have to install Neat URL, Skip Redirect and Long URL Please Mod.
I also stumbled into two problems.
Adding a new rule on a new line at the bottom caused the add-on not to use that line.
Using `$.html@phys.org`, `$html@phys.org`, `$.html@phys.*`, or `$html@phys.*` caused the extension to remove the html part from the link. Please fix.

Developer response
Posted 7 years ago
Thank you for your review.
* Neat URL is good but lacks dealing with redirects (including nested ones) and retrieving long URLs from shortened URLs.
That's why there exist other addons that do exactly that. I believe Neat URL should do one thing well, which is "cleaning URLs". Skipping redirects or retrieving long URLs from short ones will not be in the feature scope of Neat URL.
* Adding a new rule on a new line at the bottom caused the add-on not to use that line.
I don't consider this to be a high priority issue. Might be improved in a later version.
* Using `$.html@phys.org`, `$html@phys.org`, `$.html@phys.*`, or `$html@phys.*` caused the extension to remove the html part from the link. Please fix.
Sorry, this is expected behaviour and working as intended. `$` removes everything that comes after it when used in a Neat URL parameter.
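The `$` semantics the developer describes — everything after the match is dropped — can be illustrated with a small sketch. `apply_dollar_rule` is a hypothetical helper written for this explanation, not Neat URL's code:

```python
# Illustrates the '$' behaviour as described in the developer response above:
# when a rule parameter begins with '$', the URL is truncated at the first
# occurrence of the text that follows the '$'.

def apply_dollar_rule(url: str, rule: str) -> str:
    if rule.startswith("$"):
        marker = rule[1:]
        idx = url.find(marker)
        if idx != -1:
            return url[:idx]     # drop the match and everything after it
    return url
```

Under these semantics, a rule like `$.html@phys.org` matches `.html` and truncates the URL there, so the `.html` part disappearing is exactly the documented behaviour the reviewer observed.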