{"id":1421,"date":"2025-11-09T11:01:22","date_gmt":"2025-11-09T11:01:22","guid":{"rendered":"https:\/\/imalogic.com\/blog\/?p=1421"},"modified":"2025-11-09T11:07:37","modified_gmt":"2025-11-09T11:07:37","slug":"ai-enhanced-3d-pipeline","status":"publish","type":"post","link":"https:\/\/imalogic.com\/blog\/2025\/11\/09\/ai-enhanced-3d-pipeline\/","title":{"rendered":"AI-Enhanced 3D Pipeline"},"content":{"rendered":"<body>\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p><strong>Render-Conditioned Diffusion and Hybrid Neural Rendering: From Simple Prototype to Advanced 3D Pipeline<\/strong><\/p>\n<\/blockquote>\n\n\n\n<h2 class=\"wp-block-heading\">Introduction<\/h2>\n\n\n\n<p>Traditional 3D rendering pipelines ensure <strong>geometric accuracy and material consistency<\/strong>, but they can be limited when aiming to explore <strong>artistic styles, photorealism, or complex visual effects<\/strong>.<\/p>\n\n\n\n<p>Generative AI and diffusion models allow for <strong>controlled transformation, stylization, and enhancement<\/strong> of images. The concept of <strong>render-conditioned diffusion<\/strong> or <strong>hybrid neural rendering<\/strong> combines the <strong>rigor of 3D engines<\/strong> with the <strong>creativity of neural networks<\/strong>.<\/p>\n\n\n\n<p>This article first presents a <strong>simple approach<\/strong> and then generalizes to an <strong>advanced pipeline<\/strong> incorporating depth maps, normal maps, motion vectors, and pre\/post-processing modules.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\">\n\n\n\n<h2 class=\"wp-block-heading\">1. 
Simple Approach: Pilot 3D Render + AI Stylization<\/h2>\n\n\n\n<h3 class=\"wp-block-heading\">1.1 Concept<\/h3>\n\n\n\n<p>In the minimal version, each storyboard panel (\u201ccase\u201d) follows this pipeline:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code><strong>Storyboard (text description) \u2192 GPT Agent \u2192 Lua Script + 3D Assets \u2192 3D Engine \u2192 Pilot Frame \u2192 AI img2img \u2192 Final Frame<\/strong>\n<\/code><\/pre>\n\n\n\n<p><strong>Storyboard<\/strong>: a set of images or textual descriptions defining the scene.<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>GPT Agent<\/strong>: generates or modifies a Lua script to drive the 3D engine.<\/li>\n\n\n\n<li><strong>3D Engine<\/strong>: executes the script and produces a pilot frame.<\/li>\n\n\n\n<li><strong>AI img2img<\/strong>: stylizes or enhances the frame based on a text prompt.<\/li>\n<\/ul>\n\n\n\n<p>This approach allows <strong>rapid prototyping<\/strong> without requiring depth maps or motion vectors.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\">\n\n\n\n<h3 class=\"wp-block-heading\">1.2 Pre-processing (optional even for the simple pipeline)<\/h3>\n\n\n\n<p>Before sending the frame to the AI:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Standardize format, resolution, and channels.<\/li>\n\n\n\n<li>Correct any artifacts or unwanted elements from the 3D engine.<\/li>\n\n\n\n<li>Crop or align the frame for scene coherence.<\/li>\n<\/ul>\n\n\n\n<p><strong>Benefits<\/strong>: improved AI fidelity and reduced hallucinations.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\">\n\n\n\n<h3 class=\"wp-block-heading\">1.3 AI Stylization Example<\/h3>\n\n\n\n<pre class=\"wp-block-code\"><code>final_frame = stylize_frame(\"frame_001.png\", \"cinematic sci-fi, soft lighting\", seed=12345)\nfinal_frame.save(\"frame_001_final.png\")\n<\/code><\/pre>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Text prompt<\/strong> guides style and 
ambiance.<\/li>\n\n\n\n<li><strong>Fixed seed<\/strong> ensures reproducibility.<\/li>\n<\/ul>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\">\n\n\n\n<h3 class=\"wp-block-heading\">1.4 Post-processing<\/h3>\n\n\n\n<p>After AI stylization:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Partial blending with the original frame to preserve geometry.<\/li>\n\n\n\n<li>Artifact cleanup, color and contrast adjustment.<\/li>\n\n\n\n<li>Optional: smoothing or temporal interpolation for short animations.<\/li>\n<\/ul>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\">\n\n\n\n<h3 class=\"wp-block-heading\">1.5 Advantages and Limitations<\/h3>\n\n\n\n<p><strong>Advantages<\/strong>:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Fast, modular prototype.<\/li>\n\n\n\n<li>Full automation from storyboard \u2192 render \u2192 stylization.<\/li>\n\n\n\n<li>Maximum artistic flexibility.<\/li>\n<\/ul>\n\n\n\n<p><strong>Limitations<\/strong>:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Limited temporal coherence for animations.<\/li>\n\n\n\n<li>Possible hallucinations in geometry.<\/li>\n\n\n\n<li>Dependent on the quality of Lua scripts generated by the agent.<\/li>\n<\/ul>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\">\n\n\n\n<h2 class=\"wp-block-heading\">2. 
Advanced Pipeline: Render-Conditioned Diffusion<\/h2>\n\n\n\n<p>For professional productions, additional <strong>geometric and temporal constraints<\/strong> improve quality.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">2.1 Additional Inputs<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Depth map<\/strong>: object distances for geometric fidelity.<\/li>\n\n\n\n<li><strong>Normal map<\/strong>: surface orientations for coherent stylization.<\/li>\n\n\n\n<li><strong>Motion vectors<\/strong>: smooth interpolation between frames.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">2.2 Full Pipeline<\/h3>\n\n\n\n<pre class=\"wp-block-code\"><code>Storyboard \u2192 GPT Agent \u2192 Lua Script + 3D Assets \u2192 3D Engine \u2192 Frame + Depth\/Normal\/Motion \u2192 Pre-processing \u2192 Render-Conditioned AI \u2192 Post-processing \u2192 Final Frame\n<\/code><\/pre>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Pre-processing<\/strong>: standardization, geometric correction, masks\/segmentation.<\/li>\n\n\n\n<li><strong>Render-conditioned AI<\/strong>: ControlNet + diffusion produces stylized images respecting geometry.<\/li>\n\n\n\n<li><strong>Post-processing<\/strong>: artifact cleanup, blending, interpolation, upscaling.<\/li>\n<\/ul>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\">\n\n\n\n<h3 class=\"wp-block-heading\">2.3 Technical Example: ControlNet + Depth<\/h3>\n\n\n\n<p>Note: the depth ControlNet used here is trained for Stable Diffusion 1.5, so it is paired with an SD 1.5 base model, and the img2img variant of the ControlNet pipeline is used so the pilot frame and the depth map can be passed together with a <code>strength<\/code> setting.<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>from diffusers import StableDiffusionControlNetImg2ImgPipeline, ControlNetModel\nfrom PIL import Image\nimport torch\n\n# SD 1.5 depth ControlNet, loaded in the same dtype as the base model\ncontrolnet = ControlNetModel.from_pretrained(\n    \"lllyasviel\/sd-controlnet-depth\", torch_dtype=torch.float16\n)\npipe = StableDiffusionControlNetImg2ImgPipeline.from_pretrained(\n    \"runwayml\/stable-diffusion-v1-5\",\n    controlnet=controlnet,\n    torch_dtype=torch.float16\n).to(\"cuda\")\n\nimg = Image.open(\"frame_001.png\")\ndepth = Image.open(\"depth_001.png\")\n\nresult = pipe(\n    prompt=\"ultra realistic sci-fi lighting, cinematic tone, 
same geometry\",\n    image=img,\n    control_image=depth,\n    strength=0.35,\n    guidance_scale=7.5,\n    generator=torch.Generator(\"cuda\").manual_seed(12345)\n)\nresult.images[0].save(\"frame_001_final.png\")\n<\/code><\/pre>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\">\n\n\n\n<h3 class=\"wp-block-heading\">2.4 Advantages<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>High geometric fidelity.<\/li>\n\n\n\n<li>Temporal coherence for animation.<\/li>\n\n\n\n<li>Full control over style and mood.<\/li>\n\n\n\n<li>Near elimination of AI hallucinations.<\/li>\n<\/ul>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\">\n\n\n\n<h2 class=\"wp-block-heading\">3. Case-by-Case Storyboard Integration<\/h2>\n\n\n\n<p>Each storyboard panel can be processed independently:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code><strong>Text Description + Lua Script + 3D Assets \u2192 3D Engine \u2192 Pilot Frame \u2192 Pre-processing \u2192 AI \u2192 Post-processing \u2192 Final Image<\/strong>\n<\/code><\/pre>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Modular workflow.<\/li>\n\n\n\n<li>Each case stylized according to its specific prompt.<\/li>\n\n\n\n<li>Assemble into final storyboard or animation sequence.<\/li>\n<\/ul>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\">\n\n\n\n<h2 class=\"wp-block-heading\">4. 
Pre-processing and Post-processing Modules: Importance<\/h2>\n\n\n\n<figure class=\"wp-block-table\"><table class=\"has-fixed-layout\"><thead><tr><th>Module<\/th><th>Purpose<\/th><th>Benefit<\/th><\/tr><\/thead><tbody><tr><td>Pre-processing<\/td><td>Clean and standardize pilot frame<\/td><td>Reduces AI hallucinations, ensures exploitable input<\/td><\/tr><tr><td>Post-processing<\/td><td>Correct AI output, blend with original frame, interpolate<\/td><td>Geometric coherence, uniform style, smooth animation<\/td><\/tr><\/tbody><\/table><\/figure>\n\n\n\n<p>Even in the simple pipeline, these modules <strong>significantly improve quality<\/strong> and make the pipeline robust.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\">\n\n\n\n<h2 class=\"wp-block-heading\">5. Halloween Demo: Porcelain Doll in 3D &amp; AI-Generated Animation<\/h2>\n\n\n\n<p>To showcase the pipeline in action, I created a small <strong>Halloween demo<\/strong> featuring a <strong>porcelain doll<\/strong>. 
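Before looking at the demo assets, the per-case workflow the demo follows can be sketched as a small driver (a minimal illustration, not the production code: `preprocess`, `postprocess`, and the injected `stylize` callback are hypothetical helpers standing in for the engine output handling and the AI img2img call):

```python
from PIL import Image, ImageEnhance

def preprocess(frame, size=(1024, 576)):
    """Standardize channels and resolution before the AI pass."""
    return frame.convert("RGB").resize(size)

def postprocess(stylized, original, keep=0.3):
    """Blend part of the original frame back in to preserve geometry,
    then apply a light contrast adjustment."""
    stylized = stylized.resize(original.size).convert("RGB")
    blended = Image.blend(stylized, original.convert("RGB"), alpha=keep)
    return ImageEnhance.Contrast(blended).enhance(1.05)

def process_case(frame_path, prompt, stylize):
    """One storyboard case: pilot frame -> pre-processing -> AI -> post-processing."""
    original = Image.open(frame_path)
    pilot = preprocess(original)
    stylized = stylize(pilot, prompt)  # injected AI img2img call
    return postprocess(stylized, pilot)
```

Keeping the AI call injected this way makes it easy to swap a local diffusion pipeline for a hosted image or video generator without touching the pre/post-processing steps.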
The demo was made as a PC demo\/demoscene project, and AI was used to generate stylized video sequences from 3D engine frames.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">5.1 3D Engine Screenshots<\/h3>\n\n\n\n<p><strong>Porcelain Doll \u2013 Base Frame<\/strong><br><img data-recalc-dims=\"1\" decoding=\"async\" width=\"640\" height=\"360\" data-attachment-id=\"1422\" data-permalink=\"https:\/\/imalogic.com\/blog\/2025\/11\/09\/ai-enhanced-3d-pipeline\/poupee\/\" data-orig-file=\"https:\/\/i0.wp.com\/imalogic.com\/blog\/wp-content\/uploads\/2025\/11\/poupee.png?fit=1920%2C1080&amp;ssl=1\" data-orig-size=\"1920,1080\" data-comments-opened=\"0\" data-image-meta=\"{&quot;aperture&quot;:&quot;0&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;0&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;0&quot;,&quot;iso&quot;:&quot;0&quot;,&quot;shutter_speed&quot;:&quot;0&quot;,&quot;title&quot;:&quot;&quot;,&quot;orientation&quot;:&quot;0&quot;}\" data-image-title=\"poupee\" data-image-description=\"\" data-image-caption=\"\" data-large-file=\"https:\/\/i0.wp.com\/imalogic.com\/blog\/wp-content\/uploads\/2025\/11\/poupee.png?fit=810%2C456&amp;ssl=1\" class=\"wp-image-1422\" style=\"width: 640px;\" src=\"https:\/\/i0.wp.com\/imalogic.com\/blog\/wp-content\/uploads\/2025\/11\/poupee.png?resize=640%2C360&#038;ssl=1\" alt=\"\" loading=\"lazy\" srcset=\"https:\/\/i0.wp.com\/imalogic.com\/blog\/wp-content\/uploads\/2025\/11\/poupee.png?w=1920&amp;ssl=1 1920w, https:\/\/i0.wp.com\/imalogic.com\/blog\/wp-content\/uploads\/2025\/11\/poupee.png?resize=300%2C169&amp;ssl=1 300w, https:\/\/i0.wp.com\/imalogic.com\/blog\/wp-content\/uploads\/2025\/11\/poupee.png?resize=1024%2C576&amp;ssl=1 1024w, https:\/\/i0.wp.com\/imalogic.com\/blog\/wp-content\/uploads\/2025\/11\/poupee.png?resize=768%2C432&amp;ssl=1 768w, 
https:\/\/i0.wp.com\/imalogic.com\/blog\/wp-content\/uploads\/2025\/11\/poupee.png?resize=1536%2C864&amp;ssl=1 1536w, https:\/\/i0.wp.com\/imalogic.com\/blog\/wp-content\/uploads\/2025\/11\/poupee.png?w=1620&amp;ssl=1 1620w\" sizes=\"auto, (max-width: 640px) 100vw, 640px\" \/><\/p>\n\n\n\n<p><em>Base frame rendered in the 3D engine showing the doll, lighting, and scene composition.<\/em><\/p>\n\n\n\n<p><\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\">\n\n\n\n<h3 class=\"wp-block-heading\">5.2 AI-Generated Animation<\/h3>\n\n\n\n<p>The base frames from my custom 3D engine were fed into an <strong>AI video generation tool<\/strong>, producing <strong>two stylized animations<\/strong>:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code><strong>3D Engine Base Frames \u2192 Pre-processing \u2192 AI Video Generator \u2192 Stylized Animation<\/strong>\n<\/code><\/pre>\n\n\n\n<p><\/p>\n\n\n\n<p><strong>Animation 1 \u2013 Doll in Creepy Lighting<\/strong> <\/p>\n\n\n\n<figure class=\"wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<span class=\"embed-youtube\" style=\"text-align:center; display: block;\"><iframe loading=\"lazy\" class=\"youtube-player\" width=\"810\" height=\"456\" src=\"https:\/\/www.youtube.com\/embed\/cscnj_r02GM?version=3&amp;rel=1&amp;showsearch=0&amp;showinfo=1&amp;iv_load_policy=1&amp;fs=1&amp;hl=en-US&amp;autohide=2&amp;wmode=transparent\" allowfullscreen=\"true\" style=\"border:0;\" sandbox=\"allow-scripts allow-same-origin allow-popups allow-presentation allow-popups-to-escape-sandbox\"><\/iframe><\/span>\n<\/div><\/figure>\n\n\n\n<p><strong>Animation 2 \u2013 Doll with Cinematic Halloween Mood<\/strong><\/p>\n\n\n\n<figure class=\"wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<span class=\"embed-youtube\" 
style=\"text-align:center; display: block;\"><iframe loading=\"lazy\" class=\"youtube-player\" width=\"810\" height=\"456\" src=\"https:\/\/www.youtube.com\/embed\/hmUvRBUcRTA?version=3&amp;rel=1&amp;showsearch=0&amp;showinfo=1&amp;iv_load_policy=1&amp;fs=1&amp;hl=en-US&amp;autohide=2&amp;wmode=transparent\" allowfullscreen=\"true\" style=\"border:0;\" sandbox=\"allow-scripts allow-same-origin allow-popups allow-presentation allow-popups-to-escape-sandbox\"><\/iframe><\/span>\n<\/div><\/figure>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\">\n\n\n\n<h3 class=\"wp-block-heading\">5.3 Key Takeaways<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Geometry preserved<\/strong>: The AI respects the doll\u2019s structure and position from the 3D base frames.<\/li>\n\n\n\n<li><strong>Style consistency maintained<\/strong>: The animation preserves the original real-time demo look, ensuring it integrates seamlessly with the rest of the demo.<\/li>\n\n\n\n<li><strong>Rapid iteration<\/strong>: Scene-by-scene control allowed testing multiple sequences quickly.<\/li>\n\n\n\n<li><strong>Demo-ready content<\/strong>: This workflow produces both screenshots and full video sequences that match the real-time demo style.<\/li>\n<\/ul>\n\n\n\n<h2 class=\"wp-block-heading\">6. 
Full Demo Showcase<\/h2>\n\n\n\n<p>After processing individual scenes and animations, the <strong>final Halloween demo<\/strong> integrates all elements seamlessly:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>3D engine base scenes<\/strong>: All objects, cameras, and lighting are rendered in real-time.<\/li>\n\n\n\n<li><strong>AI-generated sequences<\/strong>: Animations of the porcelain doll and other interactive elements are integrated to enhance cinematic feel without breaking the style.<\/li>\n\n\n\n<li><strong>Cohesive composition<\/strong>: The AI outputs were aligned with the original demo style, ensuring consistency across all scenes.<\/li>\n\n\n\n<li><strong>Real-time interaction<\/strong>: The final demo runs smoothly as a PC demo\/demoscene, with both scripted events and AI-enhanced animation sequences.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Example Video: Full Demo<\/h3>\n\n\n\n<figure class=\"wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<span class=\"embed-youtube\" style=\"text-align:center; display: block;\"><iframe loading=\"lazy\" class=\"youtube-player\" width=\"810\" height=\"456\" src=\"https:\/\/www.youtube.com\/embed\/n-6kriY-xpw?version=3&amp;rel=1&amp;showsearch=0&amp;showinfo=1&amp;iv_load_policy=1&amp;fs=1&amp;hl=en-US&amp;autohide=2&amp;wmode=transparent\" allowfullscreen=\"true\" style=\"border:0;\" sandbox=\"allow-scripts allow-same-origin allow-popups allow-presentation allow-popups-to-escape-sandbox\"><\/iframe><\/span>\n<\/div><\/figure>\n\n\n\n<p><strong>Highlights:<\/strong><\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>The porcelain doll animation fits naturally with other scenes in the demo.<\/li>\n\n\n\n<li>Lighting, camera motion, and object placement remain faithful to the 3D engine\u2019s design.<\/li>\n\n\n\n<li>The workflow allows <strong>combining real-time content and AI-enhanced 
animations<\/strong> without stylistic clashes.<\/li>\n\n\n\n<li><strong>Textures and static imagery<\/strong> generated by AI maintain a unified style.<\/li>\n<\/ul>\n\n\n\n<h2 class=\"wp-block-heading\">7. Conclusion<\/h2>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Simple pipeline<\/strong>: quick prototyping, ideal for storyboard \u2192 3D render \u2192 AI stylization.<\/li>\n\n\n\n<li><strong>Advanced pipeline<\/strong>: integrates depth\/normal\/motion vectors, pre\/post-processing, enabling photorealistic and temporally coherent animation.<\/li>\n\n\n\n<li><strong>Modularity and scalability<\/strong>: start simple to prototype, then incrementally add constraints and optimizations.<\/li>\n<\/ul>\n\n\n\n<p>This workflow provides a <strong>solid foundation for a hybrid 3D + AI engine<\/strong>, capable of producing stylized images or animations, fully automated, with fine creative control.<\/p>\n\n\n\n<p><\/p>\n\n\n\n<p><\/p>\n\n\n\n<p><\/p>\n<\/body>","protected":false},"excerpt":{"rendered":"<p>Render-Conditioned Diffusion and Hybrid Neural Rendering: From Simple Prototype to Advanced 3D Pipeline Introduction Traditional 3D rendering pipelines ensure 
geometric<\/p>\n","protected":false},"author":1,"featured_media":1422,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"jetpack_post_was_ever_published":false,"_jetpack_newsletter_access":"","_jetpack_dont_email_post_to_subs":false,"_jetpack_newsletter_tier_id":0,"_jetpack_memberships_contains_paywalled_content":false,"_jetpack_memberships_contains_paid_content":false,"footnotes":"","jetpack_publicize_message":"","jetpack_publicize_feature_enabled":true,"jetpack_social_post_already_shared":true,"jetpack_social_options":{"image_generator_settings":{"template":"highway","default_image_id":0,"font":"","enabled":false},"version":2}},"categories":[134,133,7,66,2],"tags":[],"class_list":["post-1421","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-a-i","category-artificial-intelligence","category-coding","category-computer-graphics","category-demo"],"jetpack_publicize_connections":[],"jetpack_featured_media_url":"https:\/\/i0.wp.com\/imalogic.com\/blog\/wp-content\/uploads\/2025\/11\/poupee.png?fit=1920%2C1080&ssl=1","jetpack_sharing_enabled":true,"jetpack_shortlink":"https:\/\/wp.me\/p8J21V-mV","jetpack-related-posts":[],"_links":{"self":[{"href":"https:\/\/imalogic.com\/blog\/wp-json\/wp\/v2\/posts\/1421","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/imalogic.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/imalogic.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/imalogic.com\/blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/imalogic.com\/blog\/wp-json\/wp\/v2\/comments?post=1421"}],"version-history":[{"count":3,"href":"https:\/\/imalogic.com\/blog\/wp-json\/wp\/v2\/posts\/1421\/revisions"}],"predecessor-version":[{"id":1425,"href":"https:\/\/imalogic.com\/blog\/wp-json\/wp\/v2\/posts\/1421\/revisions\/1425"}],"wp:featuredmedia":[{"embeddable":true,"href":"ht
tps:\/\/imalogic.com\/blog\/wp-json\/wp\/v2\/media\/1422"}],"wp:attachment":[{"href":"https:\/\/imalogic.com\/blog\/wp-json\/wp\/v2\/media?parent=1421"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/imalogic.com\/blog\/wp-json\/wp\/v2\/categories?post=1421"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/imalogic.com\/blog\/wp-json\/wp\/v2\/tags?post=1421"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}