# MCP Integration for Wrdler

## Overview

Wrdler exposes AI word generation functionality as an **MCP (Model Context Protocol) tool** when running locally. This allows AI assistants and other MCP clients to generate vocabulary words for custom topics.

## What is MCP?

The Model Context Protocol (MCP) is a standard for integrating AI assistants with external tools and data sources. Gradio 5.0+ has built-in MCP server support, making it easy to expose functions as MCP tools.

**Reference:** [Building MCP Server with Gradio](https://www.gradio.app/guides/building-mcp-server-with-gradio)

## Available MCP Tools

### `generate_ai_words`

Generate 75 AI-selected words (25 each of lengths 4, 5, 6) related to a specific topic.

**Availability:** Only when running locally with `USE_HF_WORDS=false`

#### Input Parameters

```json
{
  "topic": "Ocean Life",          // Required: Theme for word generation
  "model_name": null,              // Optional: Override default AI model
  "seed": null,                    // Optional: Random seed for reproducibility
  "use_dictionary_filter": true,  // Optional: Filter against dictionary (legacy parameter)
  "selected_file": null            // Optional: Word list file for dictionary context
}
```

#### Output Format

```json
{
  "words": [
    "WAVE", "TIDE", "FISH", ...    // 75 words total (25 � 4-letter, 25 � 5-letter, 25 � 6-letter)
  ],
  "difficulties": {
    "WAVE": 0.45,                  // Difficulty score for each word
    "TIDE": 0.32,
    ...
  },
  "metadata": {
    "model_used": "microsoft/Phi-3-mini-4k-instruct",
    "transformers_available": "True",
    "gradio_client_available": "True",
    "use_hf_words": "False",
    "raw_output_length": "2048",
    "raw_output_snippet": "...",
    "ai_initial_count": "75",
    "topic": "Ocean Life",
    "dictionary_filter": "True",
    "new_words_saved": "15"
  }
}
```
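
For quick post-processing on the client side, the returned structure can be grouped by word length and sorted by difficulty. The snippet below is a minimal sketch that assumes only the `words` and `difficulties` fields documented above:

```python
# Minimal sketch: group returned words by length and sort each group
# from easiest to hardest using the difficulty scores.
from collections import defaultdict

def summarize(result: dict) -> dict[int, list[str]]:
    by_length: dict[int, list[str]] = defaultdict(list)
    for word in result["words"]:
        by_length[len(word)].append(word)
    for words in by_length.values():
        words.sort(key=lambda w: result["difficulties"].get(w, 0.5))
    return dict(by_length)

# Example: summarize(response)[4][:5] -> five easiest 4-letter words
```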

## Setup

### 1. Environment Configuration

Set the `USE_HF_WORDS` environment variable to enable local mode:

```bash
# Linux/Mac
export USE_HF_WORDS=false

# Windows (PowerShell)
$env:USE_HF_WORDS="false"

# .env file
USE_HF_WORDS=false
```

### 2. Run Gradio App

```bash
python gradio_app.py
```

You should see:

```
===========================================================================
MCP SERVER ENABLED (Local Mode)
===========================================================================
MCP tools available:
  - generate_ai_words: Generate AI vocabulary words for topics

To use MCP tools, connect your MCP client to this Gradio app.
See: https://www.gradio.app/guides/building-mcp-server-with-gradio
===========================================================================
```

### 3. Connect MCP Client

Configure your MCP client to connect to the Gradio server:

```json
{
  "mcpServers": {
    "wrdler": {
      "url": "http://localhost:7860",
      "transport": "gradio"
    }
  }
}
```
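
If no MCP client is handy, the same app can also be exercised directly with Gradio's Python client. This is only a sketch; the exact `api_name` is an assumption and should be checked against the app's "View API" page:

```python
# Sketch: call the running Gradio app with gradio_client instead of an MCP client.
# The api_name below is an assumption; look up the real endpoint in the app's API docs.
from gradio_client import Client

client = Client("http://localhost:7860")
result = client.predict(
    "Ocean Life",                   # topic
    api_name="/generate_ai_words",  # assumed endpoint name
)
print(result)
```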

## Usage Examples

### Example 1: Basic Topic Generation

**Input:**
```json
{
  "topic": "Space Exploration"
}
```

**Output:**
```json
{
  "words": [
    "STAR", "MARS", "MOON", "SHIP", "ORBIT", "COMET", ...
  ],
  "difficulties": {
    "STAR": 0.25,
    "MARS": 0.30,
    ...
  },
  "metadata": {
    "model_used": "microsoft/Phi-3-mini-4k-instruct",
    "topic": "Space Exploration",
    "ai_initial_count": "75",
    ...
  }
}
```

### Example 2: With Custom Model and Seed

**Input:**
```json
{
  "topic": "Medieval History",
  "model_name": "meta-llama/Llama-3.1-8B-Instruct",
  "seed": 42
}
```

### Example 3: Using MCP via Claude Desktop

If you have Claude Desktop configured with MCP:

1. Add Wrdler to your MCP configuration
2. In Claude, use natural language:

```
Can you generate vocabulary words about Ancient Rome using the generate_ai_words tool?
```

Claude will automatically call the MCP tool and return the results.

## Technical Details

### Implementation

The MCP integration is implemented in `wrdler/word_loader_ai.py` using Gradio's `@gr.mcp_server_function` decorator:

```python
@gr.mcp_server_function(
    name="generate_ai_words",
    description="Generate 75 AI-selected words...",
    input_schema={...},
    output_schema={...}
)
def mcp_generate_ai_words(...) -> dict:
    # Wrapper for generate_ai_words()
    ...
```

The Gradio app (`gradio_app.py`) enables the MCP server by setting `mcp_server=True` in the launch configuration:

```python
demo.launch(
    server_name="0.0.0.0",
    server_port=7860,
    mcp_server=True,  # Enable MCP server
    ...
)
```

### Conditional Registration

The MCP function is **only registered when**:
- ✅ Gradio is available
- ✅ `USE_HF_WORDS=false` (local mode)

When deployed to Hugging Face Spaces (`USE_HF_WORDS=true`), the MCP function is **not registered** to avoid conflicts with the remote API.
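
The guard itself boils down to an import check plus an environment check. The sketch below is illustrative only; `register_mcp_tools()` is a hypothetical stand-in for the real registration code in `wrdler/word_loader_ai.py`, and the default used for a missing `USE_HF_WORDS` is an assumption:

```python
# Illustrative registration guard; register_mcp_tools() is a hypothetical placeholder.
import os

def register_mcp_tools() -> None:
    """Stand-in for the real MCP registration in wrdler/word_loader_ai.py."""
    print("MCP server function 'generate_ai_words' registered (local mode)")

try:
    import gradio  # noqa: F401  # only used to confirm Gradio is importable
    GRADIO_AVAILABLE = True
except ImportError:
    GRADIO_AVAILABLE = False

# Treating a missing variable as remote mode is an assumption here.
USE_HF_WORDS = os.getenv("USE_HF_WORDS", "true").strip().lower() == "true"

if GRADIO_AVAILABLE and not USE_HF_WORDS:
    register_mcp_tools()
```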

### Word Generation Pipeline

1. **AI Generation**: Use local transformers models or HF Space API
2. **Validation**: Filter words to lengths 4, 5, 6 (uppercase A-Z only)
3. **Distribution**: Ensure 25 words per length (see the sketch after this list)
4. **Difficulty Scoring**: Calculate word difficulty metrics
5. **File Saving**: Save new words to topic-based files
6. **Return**: Provide words, difficulties, and metadata
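
Steps 2 and 3 amount to simple filtering and bucketing. A minimal sketch, independent of the actual implementation in `word_loader_ai.py`:

```python
# Sketch of validation (uppercase A-Z, lengths 4-6) and distribution (25 per length).
import re
from collections import defaultdict

VALID_WORD = re.compile(r"[A-Z]{4,6}")
TARGET_PER_LENGTH = 25

def validate_and_distribute(candidates: list[str]) -> dict[int, list[str]]:
    buckets: dict[int, list[str]] = defaultdict(list)
    for raw in candidates:
        word = raw.strip().upper()
        if not VALID_WORD.fullmatch(word):
            continue  # drop anything that is not 4-6 uppercase letters
        bucket = buckets[len(word)]
        if word not in bucket and len(bucket) < TARGET_PER_LENGTH:
            bucket.append(word)
    return dict(buckets)

# Example: validate_and_distribute(["wave", "tide!", "ocean", "coralreef"])
# -> {4: ["WAVE"], 5: ["OCEAN"]}
```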

## Troubleshooting

### MCP Function Not Appearing

**Check 1: Environment Variable**
```bash
echo $USE_HF_WORDS  # Should be "false"
```

**Check 2: Gradio Logs**
```
✅ word_loader_ai module loaded (MCP functions may be registered)
✅ MCP server function 'generate_ai_words' registered (local mode)
```

**Check 3: Gradio Version**
```bash
pip show gradio  # Should be >= 5.0.0
```

### Model Loading Issues

If you see warnings about model loading:
```
⚠️ Transformers not available; falling back to dictionary words.
```

Install transformers:
```bash
pip install transformers torch
```
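
After installing, a quick sanity check is to load the default model's tokenizer; if this succeeds, the fallback warning should go away (the full model weights are still downloaded on first generation):

```python
# Quick check that transformers is installed and can reach the default model.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("microsoft/Phi-3-mini-4k-instruct")
print("Tokenizer loaded:", type(tokenizer).__name__)
```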

### Port Conflicts

If port 7860 is in use, modify `gradio_app.py`:
```python
launch_kwargs = {
    "server_port": 7861,  # Change port
    ...
}
```

## Remote vs Local Mode

| Feature | Local Mode (`USE_HF_WORDS=false`) | Remote Mode (`USE_HF_WORDS=true`) |
|---------|-----------------------------------|-----------------------------------|
| MCP Server | ✅ Enabled | ❌ Disabled |
| AI Models | Local transformers | HF Space API |
| Word Saving | ✅ Saves to files | ✅ Saves to files |
| Best For | Development, MCP clients | Production deployment |

## Security Notes

- MCP tools run with **full local file system access**
- Only enable MCP server in **trusted environments**
- Generated words are saved to `wrdler/words/` directory
- Model weights are cached in `TMPDIR/hf-cache/`

## Further Reading

- [Gradio MCP Guide](https://www.gradio.app/guides/building-mcp-server-with-gradio)
- [MCP Specification](https://modelcontextprotocol.io/)
- [Wrdler Requirements](../specs/requirements.md)

---

**Last Updated:** 2025-11-30
**Version:** 0.1.5