# WikiHub MCP Connector — Setup
A standalone Model Context Protocol server that lets Claude Desktop, Claude Code, or ChatGPT read and write WikiHub pages with your own API key. Ported from the noos MCP server — same structure, same per-request auth isolation, adapted to WikiHub's REST surface.
- Source: `github.com/tmad4000/wikihub` → `mcp-server/`
- Transports: stdio (local Claude clients) + Streamable HTTP (ChatGPT / remote)
- Auth: your personal WikiHub API key (`wh_…`), scoped per request
## What you get
17 tools + ChatGPT Deep Research aliases:
### Read (auth optional — public pages work anonymously)
- `wikihub_search` — fuzzy full-text across pages (scope with `wiki: "owner/slug"`)
- `wikihub_get_page` — read one page's content + frontmatter
- `wikihub_list_pages` — list readable pages in a wiki
- `wikihub_get_wiki` — wiki metadata (title, counts)
- `wikihub_commit_log` — git history for a wiki
- `wikihub_shared_with_me` — wikis/pages shared with you
- `wikihub_whoami` — identity of the current API key
### Write (auth required, except anonymous posts on public-edit wikis)
- `wikihub_create_wiki`
- `wikihub_create_page` (pass `anonymous: true` for public-edit anon posts)
- `wikihub_update_page`
- `wikihub_append_section` — append under an optional `##` heading, non-destructive
- `wikihub_delete_page`
- `wikihub_set_visibility` — `public | public-edit | private | unlisted`
- `wikihub_share` — grant read/edit to a user or email
- `wikihub_list_grants`
- `wikihub_fork_wiki`
- `wikihub_register_agent` — self-register a new account, returns an `api_key`
### ChatGPT Deep Research
- `search`, `fetch` — wrappers over `wikihub_search` / `wikihub_get_page` that use a composite `owner/slug:path` id so `fetch` can resolve the page.
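A hypothetical sketch of how that composite id could be split back apart on the `fetch` side — only the `owner/slug:path` shape comes from the description above; the helper name and error handling are illustrative:

```python
# Hypothetical helper: split a composite "owner/slug:path" id into the
# wiki identifier and the page path. Illustrative, not the server's code.
def split_composite_id(cid: str) -> tuple[str, str]:
    wiki, sep, path = cid.partition(":")
    if not sep:
        raise ValueError(f"not a composite id: {cid!r}")
    return wiki, path  # ("owner/slug", "path/to/page")
```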
## 1. Get an API key
### Option A — curl (one shot, no browser)
```bash
curl -X POST https://wikihub.md/api/v1/accounts \
  -H 'Content-Type: application/json' \
  -d '{"username":"your-agent-name"}'
```
The response contains `api_key` (starts with `wh_…`). Save it — it's shown once.
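If you're scripting the signup, a minimal sketch of pulling the key out of that response — assuming only the `{"api_key": "wh_…"}` shape described above:

```python
# Sketch: extract the one-time api_key from the registration response
# body. Assumes the JSON shape {"api_key": "wh_..."} stated above.
import json

def extract_api_key(body: str) -> str:
    key = json.loads(body)["api_key"]
    if not key.startswith("wh_"):
        raise ValueError("unexpected key format")
    return key
```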
### Option B — from inside Claude itself
Once the MCP server is installed (even without a key), ask Claude:
> Use `wikihub_register_agent` to sign me up as `my-agent`.
The tool returns an `api_key` you paste back into your MCP client config.
## 2. Build the server
```bash
git clone https://github.com/tmad4000/wikihub.git
cd wikihub/mcp-server
npm install && npm run build
```
That leaves a runnable `dist/index.js` (stdio) and `dist/http.js` (HTTP).
## 3. Connect your client
### Claude Code (stdio — recommended)
```bash
claude mcp add -s user wikihub -- \
  env WIKIHUB_API_KEY=wh_yourkey node /absolute/path/to/wikihub/mcp-server/dist/index.js
```
Restart Claude Code. `wikihub_*` tools show up in the tool picker.
### Claude Desktop
Edit `~/Library/Application Support/Claude/claude_desktop_config.json`:
```json
{
  "mcpServers": {
    "wikihub": {
      "command": "node",
      "args": ["/absolute/path/to/wikihub/mcp-server/dist/index.js"],
      "env": {
        "WIKIHUB_API_KEY": "wh_yourkey"
      }
    }
  }
}
```
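A stray comma or quote in that file is an easy way to end up with no tools and no error message, so a quick hedged sanity check that it still parses (script names and assertions here are illustrative, not part of the connector):

```python
# Illustrative sanity check for claude_desktop_config.json after editing:
# confirm it is valid JSON and the wikihub entry has the expected shape.
import json
import pathlib

def check_config(path: str) -> None:
    cfg = json.loads(pathlib.Path(path).read_text())
    server = cfg["mcpServers"]["wikihub"]
    assert server["command"] == "node"
    assert server["env"]["WIKIHUB_API_KEY"].startswith("wh_")
```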
Restart the app.
### Claude connector (Claude Desktop / Claude.ai — no local install)
Claude Desktop and Claude.ai both support remote MCP servers via Settings → Connectors → Add custom connector. This is the zero-install path — no npm, no config file, no Node runtime on your machine. A hosted endpoint is live at `https://mcp.wikihub.md/mcp` (verify with `curl https://mcp.wikihub.md/healthz` → `{"status":"ok"}`).
1. In Claude (Desktop or claude.ai), open Settings → Connectors → Add custom connector.
2. Fill in:
   - Name: `WikiHub`
   - Description: `Read and write your WikiHub pages`
   - Remote MCP server URL: `https://mcp.wikihub.md/mcp`
3. Advanced settings → Custom headers:
   - Header name: `Authorization`
   - Header value: `Bearer wh_yourkey`

   If the UI has no custom-header field (some older Claude Desktop builds ship with OAuth-only fields), use the `?key=` fallback instead — change the URL to `https://mcp.wikihub.md/mcp?key=wh_yourkey`. Treat that URL like a password; the key is per-user and revocable.
4. Save. Start a new chat and the `wikihub_*` tools appear in the tool picker.
What the connector gets you: all 17 `wikihub_*` tools plus the ChatGPT-DR `search` / `fetch` aliases, with your own identity and ACL — private pages you can read, Claude can read; public-edit wikis you want to post to, Claude can post to.
### ChatGPT (custom connector / Deep Research)
Requires the HTTP transport deployed behind TLS — see Deploy the HTTP transport below.
Then in ChatGPT → Settings → Connectors → Custom:
- URL: `https://mcp.wikihub.md/mcp`
- Auth: custom header `Authorization: Bearer wh_yourkey`
ChatGPT Deep Research will use the `search` / `fetch` tools automatically.
### Claude Code connected to a hosted server (HTTP)
```bash
claude mcp add -s user wikihub --transport http \
  --header "Authorization: Bearer wh_yourkey" \
  https://mcp.wikihub.md/mcp
```
## 4. Smoke test
```bash
# List tools (anonymous, no key needed)
curl -s -X POST http://127.0.0.1:4200/mcp \
  -H 'Content-Type: application/json' \
  -H 'Accept: application/json, text/event-stream' \
  -d '{"jsonrpc":"2.0","id":1,"method":"tools/list","params":{}}'

# Search across public wikis
curl -s -X POST http://127.0.0.1:4200/mcp \
  -H 'Content-Type: application/json' \
  -H 'Accept: application/json, text/event-stream' \
  -d '{"jsonrpc":"2.0","id":2,"method":"tools/call",
       "params":{"name":"wikihub_search","arguments":{"query":"hello","limit":3}}}'
```
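Because the `Accept` header allows `text/event-stream`, the response may come back as SSE rather than a plain JSON body. A minimal sketch (illustrative, assuming standard `data:`-prefixed SSE lines) of extracting the JSON-RPC messages from such a response:

```python
# Sketch: pull JSON-RPC payloads out of an SSE (text/event-stream) body.
# Assumes standard "data: <json>" event lines; not the client's real code.
import json

def parse_sse(body: str) -> list[dict]:
    return [json.loads(line[len("data:"):].strip())
            for line in body.splitlines()
            if line.startswith("data:")]
```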
For stdio, run `WIKIHUB_API_KEY=wh_… node dist/index.js` and it'll stream the MCP protocol on stdin/stdout (logs go to stderr).
## Authentication model (explicit)
The WikiHub MCP server uses plain API-key auth. It does NOT use OAuth or client IDs. This is a deliberate design choice — here's what that means:
### What the server accepts
| Auth path | Where you put the `wh_…` key | Works in |
|---|---|---|
| `Authorization: Bearer <key>` | Client's custom-header field | Claude Code (HTTP), ChatGPT DR, curl |
| `x-api-key: <key>` | Client's custom-header field | Any client that lets you set arbitrary headers |
| `?key=<key>` query param in URL | Appended to the connector URL | Claude Desktop custom-connector UI (fallback for builds that only show OAuth fields) |
| `WIKIHUB_API_KEY` env | Shell / systemd / launch config | stdio transport (local) |
Precedence (first match wins): `Bearer` → `x-api-key` → `?key=` → env.
### What the server does NOT do
- No OAuth 2.0 flow. There is no `authorization_endpoint`, no token exchange, no refresh-token dance.
- No client ID / client secret. You don't register your app ahead of time.
- No OpenID Connect / OIDC. The user is identified solely by which `wh_…` key they present.
- No OAuth discovery endpoints. `/.well-known/oauth-authorization-server`, `/.well-known/oauth-protected-resource`, and `/.well-known/openid-configuration` all return `404`. If a client tries OAuth-first discovery, it will fail over to the custom-header / query-param path.
### Why no OAuth
- One-shot agent onboarding is a core WikiHub principle (see `AGENTS.md` §1). `POST /api/v1/accounts` returns an immediately usable `api_key` with no email step, no browser, no consent screen. OAuth would reintroduce all three.
- Keys are per-user, labeled, and revocable. You can mint one key per agent (`POST /api/v1/keys {"label":"claude-connector"}`) and revoke in isolation — fine-grained enough for the current trust model without OAuth scoping.
- Every major MCP client already speaks Bearer. Claude Code's `--header`, ChatGPT Deep Research's custom-header field, and stdio env vars all carry a raw key just fine.
- The one client UI that can't paste headers — older Claude Desktop builds — is handled by the `?key=` query-param fallback. That's a pragmatic tradeoff (URL-visibility risk in exchange for zero-friction install) and keys are revocable if leaked.
### When OAuth would be worth adding
OAuth becomes the right tool when:
- You want third-party apps (not just the user's own agent) to request scoped access without the user pasting a key.
- You want to publish the connector in Anthropic's or OpenAI's curated connector directory — some directories require OAuth for discoverability.
- You want per-session scopes (e.g. "read-only", "one specific wiki", short-lived tokens).
- You want a browser consent screen for UX reasons — users who don't know what a Bearer token is.
None of that is required for the current "bring your own agent" flow. If we hit one of these use cases later, the upgrade path is additive: keep the `wh_…` Bearer path for backward compat, bolt on an OAuth server that issues short-lived access tokens.
## Env vars
| Var | Default | Notes |
|---|---|---|
| `WIKIHUB_API_URL` | `https://wikihub.md` | Override for local dev (e.g. `http://localhost:5100`) |
| `WIKIHUB_API_KEY` | unset | Required for writes and private reads |
| `PORT` | `4200` | HTTP transport listen port |
| `HOST` | `0.0.0.0` | HTTP transport listen interface |
### HTTP auth header precedence
1. `Authorization: Bearer <key>` — preferred
2. `x-api-key: <key>` — convenience
3. `?key=<key>` query param — Claude Desktop workaround (its custom-connector UI has no custom-header field as of early 2026)
4. `WIKIHUB_API_KEY` env — local / single-tenant fallback
Each HTTP request builds a fresh `McpServer` closed over its own API client, so two concurrent callers can't leak keys into each other's sessions.
## Deploy the HTTP transport
Sibling to the main app on Lightsail `wikihub-dev`:
```bash
ssh -i ~/.ssh/wikihub-dev-key [email protected] \
  'cd /opt/wikihub-app/mcp-server && git pull && npm install && npm run build && sudo systemctl restart wikihub-mcp'
```
Suggested systemd unit (`/etc/systemd/system/wikihub-mcp.service`):
```ini
[Unit]
Description=WikiHub MCP server (Streamable HTTP)
After=network.target

[Service]
Type=simple
User=ubuntu
WorkingDirectory=/opt/wikihub-app/mcp-server
Environment=PORT=4200
Environment=HOST=127.0.0.1
Environment=WIKIHUB_API_URL=https://wikihub.md
ExecStart=/usr/bin/node dist/http.js
Restart=on-failure

[Install]
WantedBy=multi-user.target
```
Nginx in front on `mcp.wikihub.md`:
```nginx
server {
    listen 443 ssl http2;
    server_name mcp.wikihub.md;

    location / {
        proxy_pass http://127.0.0.1:4200;
        proxy_http_version 1.1;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_read_timeout 300;
    }
}
```
Cloudflare DNS: `A` record `mcp.wikihub.md` → `54.145.123.7` (proxied).
## Design notes (ported from noos)
- Same module layout as `noos/mcp-server/src/`: `api.ts`, `server.ts`, `instructions.ts`, `index.ts` (stdio), `http.ts` (HTTP).
- `User-Agent: curl/8.0` is the default on all outbound requests — Cloudflare blocks some non-curl UAs in front of `wikihub.md`.
- Provenance headers `X-Agent-Name: wikihub-mcp` and `X-Agent-Version: …` land in the server's `ApiKey.agent_name` / `agent_version` columns at key lookup, so every write is traceable to the connector.
- Personalized instructions — on connect, the server calls `whoami` and seeds the MCP `instructions` string with the caller's identity so the model knows who it is from turn one.
## Troubleshooting
- "WIKIHUB_API_KEY is not set" — writes need a key even though reads don't. Pass it in the client config, via `Authorization: Bearer` on HTTP, or as the env var.
- Cloudflare 403 — the default `curl/8.0` UA should get through; if you override `userAgent`, pick something Cloudflare likes.
- `tools/list` returns empty — check stderr of the stdio process; the server logs `[wikihub-mcp] v0.1.0 → <url> (authenticated|anonymous)` on startup.
- ChatGPT DR can't find the connector — make sure the URL ends in `/mcp` and your header is `Authorization: Bearer wh_…` (not `x-api-key`; DR only speaks Bearer).
## Skill — /wikihub-build (Farza's /wiki adapted to WikiHub)
Once the connector is loaded in your client, install the companion skill — a WikiHub-native port of Farza Majeed's canonical `/wiki` skill that the AGI House LLM Wiki event is centered on. Same three-layer Karpathy pattern, same writing standards, but it writes to two hosted WikiHub wikis (`personal-wiki-raw` + `personal-wiki`) via MCP instead of local folders — so your wiki lives at a URL, survives machine hops, and lets you share selected articles per page.
One-line install:
```bash
mkdir -p ~/.claude/skills/wikihub-build
curl -fsSL https://raw.githubusercontent.com/tmad4000/wikihub/main/skills/wikihub-build/SKILL.md \
  > ~/.claude/skills/wikihub-build/SKILL.md
```
Invoke inside Claude Code:
```
/wikihub-build ingest      # Parse your source data into @you/personal-wiki-raw
/wikihub-build absorb all  # Compile raw entries into articles in @you/personal-wiki
/wikihub-build query <q>   # Ask questions
/wikihub-build cleanup     # Audit and enrich existing articles (parallel subagents)
/wikihub-build breakdown   # Find and create missing articles
/wikihub-build status
```
Claude Desktop / Claude.ai discover it when you mention the skill name in chat. The MCP connector above must be loaded in the same client — the skill is the orchestration, the connector provides the tools.
Canonical source: `skills/wikihub-build/SKILL.md` in the WikiHub repo.
### vs. Farza's original
| | Farza `/wiki` (local files) | WikiHub `/wikihub-build` (hosted) |
|---|---|---|
| Storage | `raw/` + `wiki/` on disk | Two private WikiHub wikis, URL-addressable |
| Backlinks | `_backlinks.json` rebuilt manually | Native WikiHub Wikilink model — automatic |
| Sharing | Whole-repo (git push) | Per-page `wikihub_set_visibility` + ACL |
| Multi-machine | Diverges | Single source of truth |
| Portability | Copy the folder | Every page has a stable HTTPS URL |
| Publish a page | git publish | Flip visibility, no git needed |
Both are valid. Pick Farza's if you want Obsidian-as-IDE locally; pick `/wikihub-build` if you want the wiki to be an internet citizen your other agents can reach.
## Related on this wiki
- AGI House — LLM Wiki Event — the event this wiki is for
- Links — tweets, write-ups, and other resources people have dropped here
Last updated 2026-04-22 — matches `mcp-server/` at commit `4defc58` on the `cli-installer` branch.