
Add OBJ wireframe model viewer and CLAUDE.md

Add support for uploading and displaying OBJ 3D models as rotating
wireframe graphics on the TV output. Includes OBJ parser with
auto-scaling, NVS persistence, HTTP endpoints for upload/clear/status,
web UI with zoom and X/Y/Z rotation sliders, and screen state 18
integration with the rotation system.

Also add CLAUDE.md with project build instructions and architecture notes.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
master | melancholytron, 2 weeks ago | commit f1b18ff0a9
Changed files (2): CLAUDE.md (86), main/user_main.c (528)

CLAUDE.md
@@ -0,0 +1,86 @@
# CLAUDE.md
This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
## Project Overview
Channel3 is an ESP8266 firmware that broadcasts analog NTSC/PAL television signals. It modulates RF through GPIO3/RX at 80 MHz using the I2S bus with DMA, allowing an analog TV tuned to Channel 3 to display graphics, text, and 3D content.
## Build Commands
```bash
make # Build firmware (outputs image.elf-0x00000.bin)
make showvars # Debug: display all build variables
```
Git submodules auto-initialize on first build if missing.
## Configuration
Edit `user.cfg` for:
- `PORT` - Serial port for flashing (default: `/dev/ttyUSB0`)
- `OPTS += -DPAL` - Uncomment to enable PAL mode (default is NTSC)
- `FWBURNFLAGS` - Flash baud rate
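A `user.cfg` that overrides the defaults might look like this (values illustrative):
```
# Example user.cfg overrides
PORT = /dev/ttyUSB1
#OPTS += -DPAL          # uncomment for PAL output
```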
## Architecture
### Video Signal Generation
The core innovation is using I2S DMA at 80 MHz to generate TV signals:
1. **Premodulation tables** (`tablemaker/broadcast_tables.c`) contain 1408-bit patterns per color, chosen as an exact harmonic of both NTSC chroma (3.579545 MHz) and Channel 3 luma (61.25 MHz)
2. **Line state machine** (`tablemaker/CbTable.c`) defines behavior for each scanline (263 lines NTSC, 313 lines PAL) - sync pulses, blanking, colorburst, active video
3. **DMA engine** (`user/video_broadcast.c`) fills buffers via interrupt on each line completion, using `CbTable` to select the appropriate line handler
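A minimal sketch of the per-line dispatch in step 3 (hypothetical names; the real interrupt handler and tables live in `user/video_broadcast.c` and `tablemaker/CbTable.c`):
```c
#include <stdint.h>

#define LINES_PER_FIELD 263                 /* 263 NTSC, 313 PAL */

typedef void (*line_handler_t)(uint32_t *dma_buf, int line);

/* Placeholder handlers; the real ones write sync/blank/colorburst/video waveforms. */
static void emit_sync (uint32_t *buf, int line) { (void)buf; (void)line; }
static void emit_video(uint32_t *buf, int line) { (void)buf; (void)line; }

static const line_handler_t handlers[] = { emit_sync, emit_video };
static uint8_t cb_table[LINES_PER_FIELD];   /* one line-type index per scanline */
static int cur_line = 0;

/* Runs on each DMA buffer-complete interrupt: look up this scanline's type
   and let its handler refill the buffer that just finished transmitting. */
void line_complete_isr(uint32_t *dma_buf)
{
    handlers[cb_table[cur_line]](dma_buf, cur_line);
    cur_line = (cur_line + 1) % LINES_PER_FIELD;
}
```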
### Framebuffer
- Double-buffered: 232x220 pixels (NTSC) or 232x264 (PAL)
- 4 bits per pixel (16 colors)
- Front/back buffer swapping on frame completion
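A minimal sketch of writing one 4-bit pixel into the packed back buffer (the nibble order here is an assumption, not taken from the source):
```c
#include <stdint.h>

#define FB_W 232
#define FB_H 220                              /* 264 in PAL builds */

/* Two 4-bit pixels per byte; assumed layout: even x in the low nibble. */
static uint8_t backbuffer[FB_W / 2 * FB_H];

static void fb_set_pixel(int x, int y, uint8_t color)    /* color = 0..15 */
{
    if (x < 0 || x >= FB_W || y < 0 || y >= FB_H) return;
    uint8_t *p = &backbuffer[(y * FB_W + x) / 2];
    if (x & 1)
        *p = (*p & 0x0F) | (uint8_t)(color << 4);        /* odd x: high nibble */
    else
        *p = (*p & 0xF0) | (color & 0x0F);               /* even x: low nibble */
}
```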
### Key Source Files
- `user/video_broadcast.c` - DMA setup, interrupt handlers, modulation
- `user/3d.c` - Fixed-point 3D engine (256 = 1.0, 8-bit fractional; see the sketch after this list)
- `user/user_main.c` - Demo screens, main loop, initialization
- `tablemaker/CbTable.c` - NTSC/PAL line type definitions
- `tablemaker/broadcast_tables.c` - Premodulated waveform lookup table
- `common/` - HTTP server, mDNS, WiFi, flash filesystem
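The fixed-point convention noted for `user/3d.c` (256 = 1.0) works as sketched below; `fx_mul` is an illustrative helper, not the engine's actual API:
```c
#include <stdint.h>

#define FIXED_ONE 256    /* 8.8 fixed point: integer 256 represents 1.0 */

/* Multiply two 8.8 values; >>8 removes the doubled fractional scaling. */
static inline int32_t fx_mul(int32_t a, int32_t b)
{
    return (a * b) >> 8;
}

/* Example: 1.5 * 0.25 = 0.375  ->  fx_mul(384, 64) == 96, and 96/256 = 0.375 */
```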
### Web Interface
Connect to `http://192.168.4.1` when the device is in SoftAP mode. The NTSC control panel provides:
- Screen selection and demo freeze
- Color jamming for RF testing
- Interactive JavaScript shader for custom color waveforms
- DFT visualization
## PAL vs NTSC
Controlled by `-DPAL` compile flag. PAL mode broadcasts PAL-compliant B/W timing with NTSC color encoding (NTSC50-like). The main differences are in `CbTable.c` line counts and timing.
## ESP32 Port (esp32_channel3)
The ESP32 port is located in `esp32_channel3/` directory.
### Building and Flashing
**Claude should run these commands directly** - do not ask the user to run them manually.
Build command (from bash):
```bash
/c/Windows/System32/WindowsPowerShell/v1.0/powershell.exe -ExecutionPolicy Bypass -Command "Set-Location 'C:\git\channel3\esp32_channel3'; .\build.ps1"
```
Flash command (COM5):
```bash
/c/Windows/System32/WindowsPowerShell/v1.0/powershell.exe -ExecutionPolicy Bypass -Command "Set-Location 'C:\git\channel3\esp32_channel3'; .\flash.ps1"
```
### Technical Notes
The ESP-IDF tools have MSYSTEM checks that block builds from MSYS2. These were patched in `C:\Espressif\frameworks\esp-idf-v5.5.2\tools\`:
- `idf.py` line ~914: Added `main()` call after MSYSTEM warning
- `idf_tools.py` line ~3600: Changed `fatal()` to `warn()` and removed `SystemExit`

main/user_main.c (528)

@@ -69,6 +69,21 @@ static httpd_handle_t http_server = NULL;
static uint8_t uploaded_image[IMG_BUFFER_SIZE];
static bool has_uploaded_image = false;
// OBJ model storage
#define MAX_OBJ_VERTICES 500
#define MAX_OBJ_EDGES 1000
static int16_t obj_vertices[MAX_OBJ_VERTICES * 3]; // x,y,z per vertex
static uint16_t obj_edges[MAX_OBJ_EDGES * 2]; // v1,v2 per edge
static uint16_t obj_vertex_count = 0;
static uint16_t obj_edge_count = 0;
static bool has_obj_model = false;
static int16_t obj_zoom = 500; // Z distance (100-1500)
static uint8_t obj_rot_x = 0; // X rotation (0-255)
static uint8_t obj_rot_y = 0; // Y rotation (0-255)
static uint8_t obj_rot_z = 0; // Z rotation (0-255)
static uint8_t obj_thickness = 1; // Line thickness (1-5)
// Video streaming server
#define STREAM_PORT 5000
static bool streaming_active = false;
@@ -111,6 +126,10 @@ static int8_t margin_bottom = 0;
#define NVS_KEY_MARGIN_T "margin_t"
#define NVS_KEY_MARGIN_R "margin_r"
#define NVS_KEY_MARGIN_B "margin_b"
#define NVS_KEY_OBJ_VERTS "obj_verts"
#define NVS_KEY_OBJ_EDGES "obj_edges"
#define NVS_KEY_OBJ_VCNT "obj_vcnt"
#define NVS_KEY_OBJ_ECNT "obj_ecnt"
// MQTT Configuration (stored in NVS)
#define ALERT_DURATION_MS 5000
@@ -202,6 +221,7 @@ static uint32_t last_ha_fetch = 0;
#define SCREEN_TYPE_CLOCK 1
#define SCREEN_TYPE_HA_SENSOR 2
#define SCREEN_TYPE_IMAGE 3
#define SCREEN_TYPE_OBJ_MODEL 4
typedef struct {
uint8_t screen_type; // 0=Weather, 1=Clock, 2=HA Sensor, 3=Image, 4=3D Model
@@ -342,6 +362,280 @@ static void load_uploaded_image(void)
nvs_close(nvs);
}
/**
* @brief Save OBJ model to NVS
*/
static void save_obj_model(void)
{
nvs_handle_t nvs;
video_broadcast_pause();
esp_err_t err = nvs_open(NVS_NAMESPACE, NVS_READWRITE, &nvs);
if (err != ESP_OK) {
ESP_LOGE(TAG, "Failed to open NVS for OBJ save: %s", esp_err_to_name(err));
video_broadcast_resume();
return;
}
// Save vertex and edge counts
nvs_set_u16(nvs, NVS_KEY_OBJ_VCNT, obj_vertex_count);
nvs_set_u16(nvs, NVS_KEY_OBJ_ECNT, obj_edge_count);
// Save vertex data
size_t verts_size = obj_vertex_count * 3 * sizeof(int16_t);
err = nvs_set_blob(nvs, NVS_KEY_OBJ_VERTS, obj_vertices, verts_size);
if (err != ESP_OK) {
ESP_LOGE(TAG, "Failed to save OBJ vertices: %s", esp_err_to_name(err));
}
// Save edge data
size_t edges_size = obj_edge_count * 2 * sizeof(uint16_t);
err = nvs_set_blob(nvs, NVS_KEY_OBJ_EDGES, obj_edges, edges_size);
if (err != ESP_OK) {
ESP_LOGE(TAG, "Failed to save OBJ edges: %s", esp_err_to_name(err));
}
nvs_commit(nvs);
nvs_close(nvs);
video_broadcast_resume();
ESP_LOGI(TAG, "Saved OBJ model: %d vertices, %d edges", obj_vertex_count, obj_edge_count);
}
/**
* @brief Load OBJ model from NVS
*/
static void load_obj_model(void)
{
nvs_handle_t nvs;
esp_err_t err = nvs_open(NVS_NAMESPACE, NVS_READONLY, &nvs);
if (err != ESP_OK) {
return;
}
uint16_t vcnt = 0, ecnt = 0;
nvs_get_u16(nvs, NVS_KEY_OBJ_VCNT, &vcnt);
nvs_get_u16(nvs, NVS_KEY_OBJ_ECNT, &ecnt);
if (vcnt > 0 && vcnt <= MAX_OBJ_VERTICES && ecnt > 0 && ecnt <= MAX_OBJ_EDGES) {
size_t verts_size = vcnt * 3 * sizeof(int16_t);
size_t edges_size = ecnt * 2 * sizeof(uint16_t);
err = nvs_get_blob(nvs, NVS_KEY_OBJ_VERTS, obj_vertices, &verts_size);
if (err == ESP_OK) {
err = nvs_get_blob(nvs, NVS_KEY_OBJ_EDGES, obj_edges, &edges_size);
if (err == ESP_OK) {
obj_vertex_count = vcnt;
obj_edge_count = ecnt;
has_obj_model = true;
ESP_LOGI(TAG, "Loaded OBJ model: %d vertices, %d edges", vcnt, ecnt);
}
}
}
nvs_close(nvs);
}
/**
* @brief Clear OBJ model from memory and NVS
*/
static void clear_obj_model(void)
{
obj_vertex_count = 0;
obj_edge_count = 0;
has_obj_model = false;
nvs_handle_t nvs;
video_broadcast_pause();
if (nvs_open(NVS_NAMESPACE, NVS_READWRITE, &nvs) == ESP_OK) {
nvs_erase_key(nvs, NVS_KEY_OBJ_VERTS);
nvs_erase_key(nvs, NVS_KEY_OBJ_EDGES);
nvs_erase_key(nvs, NVS_KEY_OBJ_VCNT);
nvs_erase_key(nvs, NVS_KEY_OBJ_ECNT);
nvs_commit(nvs);
nvs_close(nvs);
}
video_broadcast_resume();
ESP_LOGI(TAG, "Cleared OBJ model");
}
/**
* @brief Check if edge already exists (avoid duplicates)
*/
static bool edge_exists(uint16_t v1, uint16_t v2, uint16_t count)
{
for (uint16_t i = 0; i < count; i++) {
uint16_t e1 = obj_edges[i * 2];
uint16_t e2 = obj_edges[i * 2 + 1];
if ((e1 == v1 && e2 == v2) || (e1 == v2 && e2 == v1)) {
return true;
}
}
return false;
}
/**
* @brief Parse OBJ file data and populate vertex/edge arrays
*/
static bool parse_obj_data(const char *data, size_t len)
{
// Reset counts
obj_vertex_count = 0;
obj_edge_count = 0;
has_obj_model = false;
// First pass: count vertices and find bounds
float min_x = 1e9f, max_x = -1e9f;
float min_y = 1e9f, max_y = -1e9f;
float min_z = 1e9f, max_z = -1e9f;
// Temporary storage for float vertices (we'll convert after finding bounds)
float *temp_verts = malloc(MAX_OBJ_VERTICES * 3 * sizeof(float));
if (!temp_verts) {
ESP_LOGE(TAG, "Failed to allocate temp vertex buffer");
return false;
}
uint16_t vert_count = 0;
const char *p = data;
const char *end = data + len;
while (p < end) {
// Skip whitespace
while (p < end && (*p == ' ' || *p == '\t')) p++;
if (p >= end) break;
// Parse vertex line: "v x y z"
if (*p == 'v' && p + 1 < end && p[1] == ' ') {
if (vert_count >= MAX_OBJ_VERTICES) {
ESP_LOGW(TAG, "OBJ vertex limit reached (%d)", MAX_OBJ_VERTICES);
break;
}
p += 2; // Skip "v "
float x = 0, y = 0, z = 0;
// Parse x
while (p < end && (*p == ' ' || *p == '\t')) p++;
x = strtof(p, (char**)&p);
// Parse y
while (p < end && (*p == ' ' || *p == '\t')) p++;
y = strtof(p, (char**)&p);
// Parse z
while (p < end && (*p == ' ' || *p == '\t')) p++;
z = strtof(p, (char**)&p);
temp_verts[vert_count * 3 + 0] = x;
temp_verts[vert_count * 3 + 1] = y;
temp_verts[vert_count * 3 + 2] = z;
if (x < min_x) min_x = x;
if (x > max_x) max_x = x;
if (y < min_y) min_y = y;
if (y > max_y) max_y = y;
if (z < min_z) min_z = z;
if (z > max_z) max_z = z;
vert_count++;
}
// Skip to end of line
while (p < end && *p != '\n' && *p != '\r') p++;
while (p < end && (*p == '\n' || *p == '\r')) p++;
}
if (vert_count == 0) {
ESP_LOGE(TAG, "No vertices found in OBJ");
free(temp_verts);
return false;
}
// Calculate scale to fit in [-200, 200] range
float width = max_x - min_x;
float height = max_y - min_y;
float depth = max_z - min_z;
float max_dim = width > height ? width : height;
if (depth > max_dim) max_dim = depth;
float scale = (max_dim > 0) ? (400.0f / max_dim) : 1.0f;
// Center offsets
float cx = (min_x + max_x) / 2.0f;
float cy = (min_y + max_y) / 2.0f;
float cz = (min_z + max_z) / 2.0f;
// Convert to fixed-point centered vertices
for (uint16_t i = 0; i < vert_count; i++) {
obj_vertices[i * 3 + 0] = (int16_t)((temp_verts[i * 3 + 0] - cx) * scale);
obj_vertices[i * 3 + 1] = (int16_t)((temp_verts[i * 3 + 1] - cy) * scale);
obj_vertices[i * 3 + 2] = (int16_t)((temp_verts[i * 3 + 2] - cz) * scale);
}
free(temp_verts);
obj_vertex_count = vert_count;
// Second pass: parse faces and extract edges
p = data;
uint16_t edge_count = 0;
while (p < end) {
// Skip whitespace
while (p < end && (*p == ' ' || *p == '\t')) p++;
if (p >= end) break;
// Parse face line: "f v1 v2 v3 ..." or "f v1/vt1/vn1 v2/vt2/vn2 ..."
if (*p == 'f' && p + 1 < end && (p[1] == ' ' || p[1] == '\t')) {
p += 2; // Skip "f "
uint16_t face_verts[16];
int face_vert_count = 0;
while (p < end && *p != '\n' && *p != '\r' && face_vert_count < 16) {
// Skip whitespace
while (p < end && (*p == ' ' || *p == '\t')) p++;
if (p >= end || *p == '\n' || *p == '\r') break;
// Parse vertex index (1-based in OBJ)
int v = strtol(p, (char**)&p, 10);
if (v > 0 && v <= vert_count) {
face_verts[face_vert_count++] = (uint16_t)(v - 1); // Convert to 0-based
}
// Skip texture/normal indices (e.g., "/vt/vn")
while (p < end && *p != ' ' && *p != '\t' && *p != '\n' && *p != '\r') p++;
}
// Create edges from face (connect consecutive vertices + last to first)
for (int i = 0; i < face_vert_count && edge_count < MAX_OBJ_EDGES; i++) {
uint16_t v1 = face_verts[i];
uint16_t v2 = face_verts[(i + 1) % face_vert_count];
// Check for duplicate edges
if (!edge_exists(v1, v2, edge_count)) {
obj_edges[edge_count * 2 + 0] = v1;
obj_edges[edge_count * 2 + 1] = v2;
edge_count++;
}
}
}
// Skip to end of line
while (p < end && *p != '\n' && *p != '\r') p++;
while (p < end && (*p == '\n' || *p == '\r')) p++;
}
obj_edge_count = edge_count;
has_obj_model = (vert_count > 0 && edge_count > 0);
ESP_LOGI(TAG, "Parsed OBJ: %d vertices, %d edges, scale=%.2f",
vert_count, edge_count, scale);
return has_obj_model;
}
/**
* @brief Load MQTT configuration from NVS
*/
@@ -618,6 +912,7 @@ static int rotation_type_to_state(uint8_t screen_type)
case SCREEN_TYPE_CLOCK: return 16;
case SCREEN_TYPE_HA_SENSOR: return 17;
case SCREEN_TYPE_IMAGE: return 12;
case SCREEN_TYPE_OBJ_MODEL: return 18;
default: return 13;
}
}
@@ -1288,6 +1583,21 @@ static const char *html_page =
"<button class='btn' onclick='uploadImage()'>UPLOAD</button></div>"
"<div style='margin-top:8px'><canvas id='preview' width='116' height='220'></canvas></div>"
"<div id='imgStatus' class='msg'></div></div></div>"
"<div><h2>> 3D_MODEL</h2><div class='panel'>"
"<div class='flex'><input type='file' id='objFile' accept='.obj' style='width:150px'>"
"<button class='btn' onclick='uploadObj()'>UPLOAD</button>"
"<button class='btn' onclick='clearObj()'>CLEAR</button></div>"
"<div id='objStatus' class='msg'>No model loaded</div>"
"<div class='sep'><b>View:</b></div>"
"<div class='flex'><label>Zoom:<input type='range' id='objZoom' min='100' max='1500' value='500' style='width:80px' oninput='setObjView()'></label>"
"<span id='objZoomVal'>500</span></div>"
"<div class='flex'><label>X:<input type='range' id='objRotX' min='0' max='255' value='0' style='width:60px' oninput='setObjView()'></label>"
"<span id='objRotXVal'>0</span>"
"<label>Y:<input type='range' id='objRotY' min='0' max='255' value='0' style='width:60px' oninput='setObjView()'></label>"
"<span id='objRotYVal'>0</span>"
"<label>Z:<input type='range' id='objRotZ' min='0' max='255' value='0' style='width:60px' oninput='setObjView()'></label>"
"<span id='objRotZVal'>0</span></div>"
"</div></div>"
"<div><h2>> DEMO_SCREENS</h2><div class='panel'><div class='flex'>"
"<button class='btn' onclick='quickScreen(0)'>STATUS</button>"
"<button class='btn' onclick='quickScreen(2)'>SYSINFO</button>"
@@ -1297,10 +1607,11 @@ static const char *html_page =
"<button class='btn' onclick='quickScreen(11)'>COLORS</button>"
"<button class='btn' onclick='quickScreen(12)'>IMAGE</button>"
"<button class='btn' onclick='quickScreen(13)'>WEATHER</button>"
"<button class='btn' onclick='quickScreen(18)'>3D MODEL</button>"
"</div></div></div>"
"<div class='full'><h2>> ROTATION</h2><div class='panel'>"
"<div id='rotationList' style='margin-bottom:8px'></div>"
"<div class='flex'><label>Add:<select id='rotType'><option value='0'>Weather</option><option value='1'>Clock</option><option value='2'>HA Sensor</option><option value='3'>Image</option></select></label>"
"<div class='flex'><label>Add:<select id='rotType'><option value='0'>Weather</option><option value='1'>Clock</option><option value='2'>HA Sensor</option><option value='3'>Image</option><option value='4'>3D Model</option></select></label>"
"<select id='rotSensor' style='display:none;width:80px'></select>"
"<label>Dur:<input id='rotDur' type='number' value='15' style='width:45px'>s</label>"
"<button class='btn' onclick='addRotSlot()'>ADD</button></div>"
@@ -1394,6 +1705,26 @@ static const char *html_page =
"document.getElementById('imgStatus').innerText=t+' (add to rotation or click IMAGE)';"
"}).catch(e=>{document.getElementById('imgStatus').innerText='Error: '+e;});}"
"function findColor(r,g,b){for(var i=0;i<16;i++)if(pal[i][0]==r&&pal[i][1]==g&&pal[i][2]==b)return i;return 0;}"
"function loadObjStatus(){fetch('/obj/status').then(r=>r.json()).then(d=>{"
"document.getElementById('objStatus').innerText=d.loaded?(d.vertices+' vertices, '+d.edges+' edges'):'No model loaded';"
"document.getElementById('objZoom').value=d.zoom;document.getElementById('objZoomVal').innerText=d.zoom;"
"document.getElementById('objRotX').value=d.rx;document.getElementById('objRotXVal').innerText=d.rx;"
"document.getElementById('objRotY').value=d.ry;document.getElementById('objRotYVal').innerText=d.ry;"
"document.getElementById('objRotZ').value=d.rz;document.getElementById('objRotZVal').innerText=d.rz;"
"}).catch(e=>console.log(e));}"
"function setObjView(){var z=document.getElementById('objZoom').value;"
"var rx=document.getElementById('objRotX').value,ry=document.getElementById('objRotY').value,rz=document.getElementById('objRotZ').value;"
"document.getElementById('objZoomVal').innerText=z;"
"document.getElementById('objRotXVal').innerText=rx;document.getElementById('objRotYVal').innerText=ry;document.getElementById('objRotZVal').innerText=rz;"
"fetch('/obj/settings?zoom='+z+'&rx='+rx+'&ry='+ry+'&rz='+rz);}"
"function uploadObj(){var f=document.getElementById('objFile').files[0];"
"if(!f){alert('Select an OBJ file');return;}"
"document.getElementById('objStatus').innerText='Uploading...';"
"var reader=new FileReader();reader.onload=function(e){"
"fetch('/obj/upload',{method:'POST',body:e.target.result}).then(r=>r.text()).then(t=>{"
"document.getElementById('objStatus').innerText=t;loadObjStatus();}).catch(e=>{"
"document.getElementById('objStatus').innerText='Error: '+e;});};reader.readAsText(f);}"
"function clearObj(){fetch('/obj/clear').then(()=>loadObjStatus());}"
"var marginsLoaded=false;"
"function updateStatus(){fetch('/status').then(r=>r.json()).then(d=>{"
"document.getElementById('status').innerHTML='> FRAME: '+d.frame+' | SCREEN: '+d.screen+"
@@ -1480,7 +1811,7 @@ static const char *html_page =
"var rotConfig={slots:[]};"
"function loadRotation(){fetch('/rotation/status').then(r=>r.json()).then(d=>{"
"rotConfig=d;renderRotation();}).catch(e=>console.log(e));}"
"function renderRotation(){var h='';var types=['Weather','Clock','HA Sensor','Image'];"
"function renderRotation(){var h='';var types=['Weather','Clock','HA Sensor','Image','3D Model'];"
"for(var i=0;i<rotConfig.count;i++){var s=rotConfig.slots[i];if(!s)continue;"
"var name=types[s.type]||'Unknown';if(s.type==2&&haConfig.sensors[s.sensor_idx])"
"name+=' ('+haConfig.sensors[s.sensor_idx].name+')';"
@@ -1536,7 +1867,7 @@ static const char *html_page =
"if(r.ok){document.getElementById('otaStatus').innerText='Update complete! Rebooting...';}"
"else{r.text().then(t=>{document.getElementById('otaStatus').innerText='Update failed: '+t;});}}"
").catch(e=>{document.getElementById('otaStatus').innerText='Upload failed: '+e;});}"
"updateStatus();loadMqtt();loadHaConfig();loadRotation();loadTransition();loadOtaStatus();setInterval(updateStatus,2000);setInterval(loadHaConfig,10000);"
"updateStatus();loadMqtt();loadHaConfig();loadRotation();loadTransition();loadOtaStatus();loadObjStatus();setInterval(updateStatus,2000);setInterval(loadHaConfig,10000);"
"</script></body></html>";
/**
@@ -1793,6 +2124,117 @@ static esp_err_t upload_handler(httpd_req_t *req)
return ESP_OK;
}
/**
* @brief Handler for POST /obj/upload - Upload OBJ file
*/
static esp_err_t obj_upload_handler(httpd_req_t *req)
{
// Limit to reasonable size (64KB max)
if (req->content_len > 65536) {
httpd_resp_send_err(req, HTTPD_400_BAD_REQUEST, "File too large (max 64KB)");
return ESP_FAIL;
}
// Allocate temp buffer
char *buf = malloc(req->content_len + 1);
if (!buf) {
httpd_resp_send_err(req, HTTPD_500_INTERNAL_SERVER_ERROR, "Out of memory");
return ESP_FAIL;
}
// Receive OBJ text
int received = 0;
while (received < req->content_len) {
int ret = httpd_req_recv(req, buf + received, req->content_len - received);
if (ret <= 0) {
if (ret == HTTPD_SOCK_ERR_TIMEOUT) continue;
free(buf);
httpd_resp_send_err(req, HTTPD_500_INTERNAL_SERVER_ERROR, "Receive failed");
return ESP_FAIL;
}
received += ret;
}
buf[received] = '\0';
// Parse OBJ
if (parse_obj_data(buf, received)) {
save_obj_model(); // Persist to NVS
char resp[64];
snprintf(resp, sizeof(resp), "Loaded: %d vertices, %d edges",
obj_vertex_count, obj_edge_count);
httpd_resp_send(req, resp, -1);
} else {
httpd_resp_send_err(req, HTTPD_400_BAD_REQUEST, "Parse failed");
}
free(buf);
return ESP_OK;
}
/**
* @brief Handler for GET /obj/status - Return OBJ model status as JSON
*/
static esp_err_t obj_status_handler(httpd_req_t *req)
{
char response[192];
snprintf(response, sizeof(response),
"{\"loaded\":%s,\"vertices\":%d,\"edges\":%d,\"zoom\":%d,\"rx\":%d,\"ry\":%d,\"rz\":%d,\"thick\":%d}",
has_obj_model ? "true" : "false",
obj_vertex_count, obj_edge_count, obj_zoom, obj_rot_x, obj_rot_y, obj_rot_z, obj_thickness);
httpd_resp_set_type(req, "application/json");
httpd_resp_send(req, response, strlen(response));
return ESP_OK;
}
/**
* @brief Handler for GET /obj/clear - Clear OBJ model
*/
static esp_err_t obj_clear_handler(httpd_req_t *req)
{
clear_obj_model();
httpd_resp_send(req, "Model cleared", -1);
return ESP_OK;
}
/**
* @brief Handler for GET /obj/settings - Set zoom and rotation
*/
static esp_err_t obj_settings_handler(httpd_req_t *req)
{
char buf[128];
char param[16];
if (httpd_req_get_url_query_str(req, buf, sizeof(buf)) == ESP_OK) {
if (httpd_query_key_value(buf, "zoom", param, sizeof(param)) == ESP_OK) {
int z = atoi(param);
if (z >= 100 && z <= 1500) obj_zoom = z;
}
if (httpd_query_key_value(buf, "rx", param, sizeof(param)) == ESP_OK) {
int r = atoi(param);
if (r >= 0 && r <= 255) obj_rot_x = r;
}
if (httpd_query_key_value(buf, "ry", param, sizeof(param)) == ESP_OK) {
int r = atoi(param);
if (r >= 0 && r <= 255) obj_rot_y = r;
}
if (httpd_query_key_value(buf, "rz", param, sizeof(param)) == ESP_OK) {
int r = atoi(param);
if (r >= 0 && r <= 255) obj_rot_z = r;
}
if (httpd_query_key_value(buf, "thick", param, sizeof(param)) == ESP_OK) {
int t = atoi(param);
if (t >= 1 && t <= 5) obj_thickness = t;
}
}
char response[128];
snprintf(response, sizeof(response), "{\"zoom\":%d,\"rx\":%d,\"ry\":%d,\"rz\":%d,\"thick\":%d}", obj_zoom, obj_rot_x, obj_rot_y, obj_rot_z, obj_thickness);
httpd_resp_set_type(req, "application/json");
httpd_resp_send(req, response, strlen(response));
return ESP_OK;
}
/**
* @brief Handler for GET /mqtt/status - Return current MQTT config as JSON
*/
@@ -3031,6 +3473,35 @@ static void start_webserver(void)
};
httpd_register_uri_handler(http_server, &upload_uri);
// OBJ model endpoints
httpd_uri_t obj_upload_uri = {
.uri = "/obj/upload",
.method = HTTP_POST,
.handler = obj_upload_handler
};
httpd_register_uri_handler(http_server, &obj_upload_uri);
httpd_uri_t obj_status_uri = {
.uri = "/obj/status",
.method = HTTP_GET,
.handler = obj_status_handler
};
httpd_register_uri_handler(http_server, &obj_status_uri);
httpd_uri_t obj_clear_uri = {
.uri = "/obj/clear",
.method = HTTP_GET,
.handler = obj_clear_handler
};
httpd_register_uri_handler(http_server, &obj_clear_uri);
httpd_uri_t obj_settings_uri = {
.uri = "/obj/settings",
.method = HTTP_GET,
.handler = obj_settings_handler
};
httpd_register_uri_handler(http_server, &obj_settings_uri);
httpd_uri_t mqtt_status_uri = {
.uri = "/mqtt/status",
.method = HTTP_GET,
@@ -3799,6 +4270,54 @@ static void DrawFrame(void)
break;
}
case 18: // OBJ Model wireframe
{
if (has_obj_model) {
CNFGColor(15); // White wireframe
SetupMatrix();
// Apply manual rotation
tdRotateEA(ModelviewMatrix, obj_rot_x, obj_rot_y, obj_rot_z);
// Push model back from camera (Z distance)
ModelviewMatrix[11] = obj_zoom;
// Draw all edges (1 pixel thin)
for (int e = 0; e < obj_edge_count; e++) {
int16_t *v1 = &obj_vertices[obj_edges[e * 2] * 3];
int16_t *v2 = &obj_vertices[obj_edges[e * 2 + 1] * 3];
Draw3DSegment(v1, v2);
}
// Display vertex/edge count at bottom
char info[32];
snprintf(info, sizeof(info), "%dv %de", obj_vertex_count, obj_edge_count);
CNFGColor(8); // Gray
CNFGPenX = 2 + margin_left;
CNFGPenY = 200 + margin_top;
CNFGDrawText(info, 1);
} else {
// No model loaded - show message
CNFGColor(7);
int msg_width = 8 * 3 * 2; // "NO MODEL" width
CNFGPenX = (FBW2 - msg_width) / 2;
CNFGPenY = 100 + margin_top;
CNFGDrawText("NO MODEL", 2);
CNFGColor(8);
msg_width = 18 * 3 * 1; // "Upload via web UI" width
CNFGPenX = (FBW2 - msg_width) / 2;
CNFGPenY = 130 + margin_top;
CNFGDrawText("Upload via web UI", 1);
}
// Transition when rotation duration expires
if (rotation_duration_expired()) {
newstate = advance_rotation();
}
break;
}
case 13: // Weather display - 3 pages: current, forecast 1-3, forecast 4-6
{
char weather_text[64];
@@ -4344,6 +4863,9 @@ void app_main(void)
// Load uploaded image from NVS if available
load_uploaded_image();
// Load OBJ model from NVS if available
load_obj_model();
// Initialize WiFi
#ifdef CONFIG_WIFI_MODE_STATION
wifi_init_station();
