# Take your LLM apps to the next level with Chainlit!

## Part 2

[Tituslhy](/@tituslhy?source=post_page---byline--00036c8db1ba---------------------------------------) · 15 min read · May 5, 2025

*Typing's too slow, use a mic! Image generated by ChatGPT using the author's image.*

In my [previous article](/mitb-for-all/its-2025-start-using-chainlit-for-your-llm-apps-558db1a46315), I wrote about how to build a generative AI assistant with Google Maps capabilities from the ground up. We did some amazing things:

* Built an app to support the entire chat lifecycle: login, start chat, handle incoming messages, stop tasks, resume chats, and logout
* Added **chat starters** to help users kick off conversations
* Created **chat profiles** to inject different system prompts for our LLM agent
* Implemented **chat settings** to allow advanced configuration (e.g. choosing different LLMs or tweaking temperature)
* Built a custom **Canvas** using Google Maps, wrapped into a LlamaIndex Function Agent

We wrapped up steps 1–6 above on our home page. In this article, we're taking things to the next level (actually, several levels): adding **image generation**, **AutoRAG**, **audio support**, and **Model Context Protocol (MCP)** capabilities.

As before, all my code can be found in my [GitHub repository](https://github.com/tituslhy/sturdy-octo-fortnight).

## 1. Adding image generation capabilities via Chainlit Commands 🖼️

The ChatGPT interface can generate images if you ask it to. We can configure this in Chainlit too. Let's add the following to app.py:

```
commands = [
    {"id": "Picture", "icon": "image", "description": "Use DALL-E"},
]

@cl.on_chat_start
async def start():
    """Handler for chat start events. Sets session variables."""
    await cl.context.emitter.set_commands(commands)
    ...
```
In Chainlit, commands are defined as a list of dictionaries. Each dictionary must include:
* `id`: the name of the command (e.g. `"Picture"`)
* `icon`: a [Lucide icon](https://lucide.dev/icons/) name
* `description`: a short label for what the command does

These commands are set at the beginning of each chat using `set_commands()`. Once your app is restarted, a **Commands** button will appear in the message bar, letting users choose from your list.

Not so fast — clicking the command won't do anything _yet_. We haven't wired up the image generation logic. Since Chainlit sends commands via the message bar, we'll need to modify our `@cl.on_message` handler. Add this code snippet to your app.py:

```
from openai import AsyncOpenAI

openai_client = AsyncOpenAI(api_key="...")

@cl.on_message
async def on_message(message: cl.Message):
    if message.command == "Picture":
        response = await openai_client.images.generate(
            model="dall-e-3",
            prompt=message.content,
            size="1024x1024",
        )
        logger.info(f"Image generated. Response: {response}")
        image_url = response.data[0].url
        elements = [cl.Image(url=image_url)]
        await cl.Message(
            f"Here's what I generated for **{message.content}**",
            elements=elements,
        ).send()
    else:
        ...  # your on_message logic here
```

Chainlit automatically detects commands attached to a message, so you can easily branch your logic based on `message.command`.

In this case, when we detect the `"Picture"` command, we treat the user's message as the prompt and send it to OpenAI's DALL·E 3 model. Once the image is generated, we extract the URL and use Chainlit's `cl.Image()` to render it directly in the chat interface.

✨ **Voila!** You now have image generation working inside your chat app — and it looks fantastic.

## 2. Audio capabilities 🎙️

This next section takes everything we've built so far (already "MasterChef" level) and elevates it to "Michelin Star Chef" territory.

At its core, enabling audio capabilities means supporting both **speech-to-text** and **text-to-speech**:
* For **speech-to-text**, we'll use OpenAI's Whisper (yes, we're already using GPT-4o-mini and DALL·E 3 — OpenAI really is a full-stack LLM shop).
* For **text-to-speech**, we'll use my favorite service: [**ElevenLabs**](https://elevenlabs.io/app/speech-synthesis/text-to-speech). It offers high-quality voices (even Singaporean English!) and a generous free tier. Grab an API key to get started.

### 📦 Setup Code

Here's our initial setup for handling audio input:

```
## Setup
import audioop  # used by the audio chunk handler below
import os
import wave     # used when writing the recorded audio to a .wav buffer

from dotenv import load_dotenv, find_dotenv
import chainlit as cl
import numpy as np

_ = load_dotenv(find_dotenv())

SILENCE_THRESHOLD = 3500   # Adjust based on your audio level (e.g., lower for quieter audio)
SILENCE_TIMEOUT = 1300.0   # Milliseconds of silence to consider the turn finished

ELEVENLABS_API_KEY = os.getenv("ELEVENLABS_API_KEY")
ELEVENLABS_VOICE_ID = os.getenv("ELEVENLABS_VOICE_ID")

@cl.on_audio_start
async def on_audio_start():
    """Handler to manage mic button click event"""
    cl.user_session.set("silent_duration_ms", 0)
    cl.user_session.set("is_speaking", False)
    cl.user_session.set("audio_chunks", [])
    user = cl.user_session.get("user")
    logger.info(f"{user} is starting an audio stream...")
    return True
```

The `on_audio_start` handler instantiates the audio-specific session variables when audio recording starts.
Now let's handle the actual audio stream:

```
@cl.on_audio_chunk
async def on_audio_chunk(chunk: cl.InputAudioChunk):
    """Handler function to manage audio chunks
    Source: Chainlit Cookbook
    """
    audio_chunks = cl.user_session.get("audio_chunks")
    if audio_chunks is not None:
        audio_chunk = np.frombuffer(chunk.data, dtype=np.int16)
        audio_chunks.append(audio_chunk)

    # If this is the first chunk, initialize timers and state
    if chunk.isStart:
        cl.user_session.set("last_elapsed_time", chunk.elapsedTime)
        cl.user_session.set("is_speaking", True)
        return

    last_elapsed_time = cl.user_session.get("last_elapsed_time")
    silent_duration_ms = cl.user_session.get("silent_duration_ms")
    is_speaking = cl.user_session.get("is_speaking")

    # Calculate the time difference between this chunk and the previous one
    time_diff_ms = chunk.elapsedTime - last_elapsed_time
    cl.user_session.set("last_elapsed_time", chunk.elapsedTime)

    # Compute the RMS (root mean square) energy of the audio chunk
    audio_energy = audioop.rms(
        chunk.data, 2
    )  # Assumes 16-bit audio (2 bytes per sample)

    if audio_energy < SILENCE_THRESHOLD:
        # Audio is considered silent
        silent_duration_ms += time_diff_ms
        cl.user_session.set("silent_duration_ms", silent_duration_ms)
        if silent_duration_ms >= SILENCE_TIMEOUT and is_speaking:
            cl.user_session.set("is_speaking", False)
            await process_audio()
    else:
        # Audio is not silent, reset silence timer and mark as speaking
        cl.user_session.set("silent_duration_ms", 0)
        if not is_speaking:
            cl.user_session.set("is_speaking", True)
```

### 🧠 How This Works

As audio chunks are streamed in, we use `numpy` to convert each byte buffer into a numeric array. We also use Python's `audioop.rms()` to calculate the **energy level** of each chunk. If the energy stays below a threshold for long enough (`SILENCE_TIMEOUT`), we assume the user has stopped speaking — and trigger audio processing.

Before we dive into Whisper and ElevenLabs integration, let's abstract the message generation logic from [earlier](/mitb-for-all/its-2025-start-using-chainlit-for-your-llm-apps-558db1a46315) into a reusable function:

```
# Assumes the imports already used in Part 1 are present, e.g.:
# from llama_index.core.llms import ChatMessage, MessageRole
# from llama_index.core.agent.workflow import AgentStream, ToolCall

async def generate_answer(query: str):
    agent = cl.user_session.get("agent")
    memory = cl.user_session.get("memory")
    chat_history = memory.get()
    msg = cl.Message("", type="assistant_message")
    context = cl.user_session.get("context")
    handler = agent.run(
        query,
        chat_history=chat_history,
        ctx=context,
    )
    async for event in handler.stream_events():
        if isinstance(event, AgentStream):
            await msg.stream_token(event.delta)
        elif isinstance(event, ToolCall):
            with cl.Step(name=f"{event.tool_name} tool", type="tool"):
                continue
    response = await handler
    await msg.send()
    memory.put(
        ChatMessage(
            role=MessageRole.USER,
            content=query,
        )
    )
    memory.put(
        ChatMessage(
            role=MessageRole.ASSISTANT,
            content=str(response),
        )
    )
    cl.user_session.set("memory", memory)
    return msg
```

This lets us reuse the same flow for both text and audio inputs.
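For example, the plain-text path of our `@cl.on_message` handler can now simply delegate to this helper. Here's a minimal sketch of that branch (the full handler, including the AutoRAG logic, appears in Section 3 below):

```
@cl.on_message
async def on_message(message: cl.Message):
    if message.command == "Picture":
        ...  # image generation branch from Section 1
    else:
        # Typed messages and transcribed audio now share one answer pipeline
        await generate_answer(message.content)
```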
Now let's write our function to transform text to speech:

```
import io
import httpx

@cl.step(type="tool")
async def text_to_speech(text: str, mime_type: str):
    """Our main text to speech function
    Source: Chainlit Cookbook, ElevenLabs Documentation
    """
    CHUNK_SIZE = 1024
    url = f"https://api.elevenlabs.io/v1/text-to-speech/{ELEVENLABS_VOICE_ID}"
    headers = {
        "Accept": mime_type,
        "Content-Type": "application/json",
        "xi-api-key": ELEVENLABS_API_KEY,
    }
    data = {
        "text": text,
        "model_id": "eleven_multilingual_v2",
        "voice_settings": {"stability": 0.5, "similarity_boost": 0.5},
    }
    async with httpx.AsyncClient(timeout=25.0) as client:
        response = await client.post(url, json=data, headers=headers)
        response.raise_for_status()  # Ensure we notice bad responses

        buffer = io.BytesIO()
        buffer.name = f"output_audio.{mime_type.split('/')[1]}"
        async for chunk in response.aiter_bytes(chunk_size=CHUNK_SIZE):
            if chunk:
                buffer.write(chunk)
        buffer.seek(0)
        return buffer.name, buffer.read()
```

We are essentially posting a request to the ElevenLabs API to get the audio back as a stream of bytes, which we return so it can be rendered as a downloadable element alongside our message. Speech-to-text is even easier because we just post the entire recording to OpenAI:

```
from openai import AsyncOpenAI

openai_client = AsyncOpenAI(api_key="...")

@cl.step(type="tool")
async def speech_to_text(audio_file):
    response = await openai_client.audio.transcriptions.create(
        model="whisper-1",
        file=audio_file,
        language="en",
    )
    return response.text
```

The only thing to note is that the Whisper API expects an `audio_file` object, which we have to construct ourselves first (you'll see this in `process_audio` below).

📌 **Tip**: Specify a language to improve accuracy. Whisper thinks I speak Malay at times.

And now to finally process the audio stream head on, using all our previous functions:

```
async def process_audio():
    """Processes the audio buffer from the session
    Source: Chainlit Cookbook
    """
    if audio_chunks := cl.user_session.get("audio_chunks"):
        # Concatenate all chunks
        concatenated = np.concatenate(list(audio_chunks))

        # Create an in-memory binary stream
        wav_buffer = io.BytesIO()

        # Create WAV file with proper parameters
        with wave.open(wav_buffer, "wb") as wav_file:
            wav_file.setnchannels(1)       # mono
            wav_file.setsampwidth(2)       # 2 bytes per sample (16-bit)
            wav_file.setframerate(24000)   # sample rate (24kHz PCM)
            wav_file.writeframes(concatenated.tobytes())

        # Reset buffer position
        wav_buffer.seek(0)
        cl.user_session.set("audio_chunks", [])

        frames = wav_file.getnframes()
        rate = wav_file.getframerate()
        duration = frames / float(rate)
        if duration <= 1.71:
            print("The audio is too short, please try again.")
            return

        audio_buffer = wav_buffer.getvalue()
        input_audio_el = cl.Audio(content=audio_buffer, mime="audio/wav")

        whisper_input = ("audio.wav", audio_buffer, "audio/wav")
        transcription = await speech_to_text(whisper_input)  # send to Whisper

        user = cl.user_session.get("user")
        logger.info(f"Received message: '{transcription}' from {user}")
        await cl.Message(
            author="You",
            type="user_message",
            content=transcription,
            elements=[input_audio_el],
        ).send()

        ## Now to answer the question
        msg = await generate_answer(transcription)  # send to gpt-4o-mini

        # Send to ElevenLabs
        _, output_audio = await text_to_speech(msg.content, "audio/wav")
        output_audio_el = cl.Audio(
            auto_play=True,
            mime="audio/wav",
            content=output_audio,
        )
        msg.elements = [output_audio_el]
        await msg.update()
```

Here's what's happening:
* We first take all the audio chunks and write a .wav file as an element to be appended to a Chainlit message.
* Since we've already read the audio buffer while generating this .wav file, we need to reset the buffer position to zero, and wrap the file format and the buffer into a tuple for the OpenAI Whisper client. We then send everything to OpenAI to get the `transcription`.
* We send the transcription along with the .wav file as a message **from the user** (`author="You"` makes the message pop up on the user's side of the chat). The .wav file is downloadable, and this step is purely for user experience — so the user knows their question was captured correctly.
* We then send the raw transcription to the LLM agent to generate an answer, send the answer to ElevenLabs to generate the audio file, wrap that as a Chainlit `Audio` element and append it to the LLM agent's reply. Finally, we send the reply and the audio element as a message.

Wow, that was a lot of code! I wish I could say I wrote it all from scratch, but full credit goes to the Chainlit team — this was adapted from their excellent [audio cookbook](https://github.com/Chainlit/cookbook/blob/main/openai-whisper/app.py).

I spent quite a bit of time studying how it worked and then tailoring it to use [Azure's Text-To-Speech service](https://learn.microsoft.com/en-us/azure/ai-services/speech-service/text-to-speech) at work. It was a great way to learn the internals of audio management while bending it to fit real-world needs (and impress your bosses). Once you've added all the code above into app.py, restart the app and watch the magic happen.

Was that magic or what!

## 3. 🔗 Auto RAG capabilities — upload any file and chat with it!

This is the bread and butter of any LLM application — the ability to RAG files on demand (instead of reading them). Clicking on the file-attach icon pins the file as a Chainlit `element` on the message you send. We just need to edit our `on_message` code to cater for this:

```
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex
from llama_index.embeddings.ollama import OllamaEmbedding
from llama_index.llms.openai import OpenAI
from llama_index.core.tools import QueryEngineTool

@cl.on_message
async def on_message(message: cl.Message):
    ...  # pre-processing code
    if message.command == "Picture":
        ...  # image generation code
    else:
        if len(message.elements) > 0:
            await cl.Message("Processing files").send()
            filepaths = [file.path for file in message.elements]
            filenames = [file.name for file in message.elements]
            documents = SimpleDirectoryReader(input_files=filepaths).load_data()

            ## Ingest documents into an in-memory vector database.
            ## embed_model is instantiated elsewhere in app.py.
            index = VectorStoreIndex.from_documents(documents, embed_model=embed_model)
            await cl.Message("Processed uploaded files").send()

            openai_llm = cl.user_session.get("llm")
            name = openai_llm.complete(
                "Based on these filenames, come up with a short, concise name that "
                "describes these documents. For example 'MBA Value Analysis'. "
                "Do not return any '.pdf' or file extensions, just the name. "
                f"Filenames: {', '.join(filenames)}"
            )
            description = openai_llm.complete(
                "Based on these filenames, come up with a consolidated description that "
                "describes these documents. For example 'Answers questions about animals'. "
                f"Filenames: {', '.join(filenames)}"
            )
            await cl.Message(
                f"Uploaded document/s follow the theme: {name}. Here's the general "
                f"description of the document/s uploaded: {description}"
            ).send()

            tool = QueryEngineTool.from_defaults(
                query_engine=index.as_query_engine(similarity_top_k=8, llm=openai_llm),
                name="_".join(str(name).split(" ")),
                description=str(description),
            )
            agent_tools = cl.user_session.get("agent_tools", [])
            agent_tools.append(tool)
            agent = FunctionAgent(tools=agent_tools, llm=openai_llm)  # FunctionAgent was already imported in Part 1
            cl.user_session.set("agent", agent)
            cl.user_session.set("agent_tools", agent_tools)

        await generate_answer(message.content)  # use our new generate_answer function from before
```

When Chainlit detects uploaded files, it temporarily stores them in a `.files` directory and exposes the paths via `message.elements`. We then:
* Use `SimpleDirectoryReader` to load the files (it works with PDFs, PPTs, Word, Excel, even audio/video if Whisper is available).
* Create a temporary index for similarity search.
* Dynamically generate metadata (name and description) for the tool using the OpenAI LLM.
* Convert the index to a query engine (with just one line of code — thank you LlamaIndex!) and register a `QueryEngineTool` in this user session for the agent to use.

📌 **Tip**: Swap `SimpleDirectoryReader` with `LlamaParse` for smarter document chunking and cleaner structure — see the sketch below.
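Here's a minimal sketch of that swap, assuming the `llama-parse` package is installed and a `LLAMA_CLOUD_API_KEY` is set in your environment (both are assumptions on top of the article's setup). The parser is registered as a per-extension file extractor, so the rest of the pipeline stays unchanged:

```
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex
from llama_parse import LlamaParse

parser = LlamaParse(result_type="markdown")  # reads LLAMA_CLOUD_API_KEY from the environment

documents = SimpleDirectoryReader(
    input_files=filepaths,            # the same filepaths gathered from message.elements
    file_extractor={".pdf": parser},  # route PDFs through LlamaParse
).load_data()
index = VectorStoreIndex.from_documents(documents, embed_model=embed_model)
```

LlamaParse returns markdown-flavoured documents, which tend to chunk more cleanly than raw text extraction.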
Note that the uploaded files are tied to the _current user session_. Once the chat ends, those files (and their corresponding tools) are lost unless you implement persistence. That's why we warn users when they resume a chat:

```
@cl.on_chat_resume
async def on_chat_resume(thread: ThreadDict):
    ...  # your on chat resume logic
    await cl.Message(
        "Chat resumed. Do note that previously uploaded documents will not be "
        "available in this chat and must be uploaded again"
    ).send()
```

## 4. 🔌 Model Context Protocol: Chat with all tools!

So far, we've learned how to:
* Define tools statically in code (Google Maps).
* Dynamically register tools to an agent during a session (AutoRAG).

Now it's time to level up: let's give our LLM agent the ability to access _any available tools on demand_, based on user needs — using the **Model Context Protocol (MCP)**! I've previously written an [article about MCPs](/mitb-for-all/model-context-protocol-hype-or-necessity-998874d95bc6) — feel free to check it out, but here's the gist:

The value of MCP lies in enabling **dynamic tool availability** at runtime. Your LLM can connect to or disconnect from external services like Jira or Confluence directly in chat — enhancing user flexibility and keeping the frontend experience fluid.

Official vendors often provide Docker-based MCP servers — or you can build your own. For this demo, let's integrate Jira and allow our LLM to manage tickets in real time.

### 🛠 Step 1: Handle MCP Connections

When a user connects to an MCP server, we'll fetch the available tools, unpack them using LlamaIndex's `to_tool_list_async()`, and update the agent accordingly:
```
from llama_index.tools.mcp import BasicMCPClient, McpToolSpec
# FunctionAgent and Context are already imported in app.py

@cl.on_mcp_connect
async def on_mcp_connect(connection):
    """Handler to connect to an MCP server.
    Lists tools available on the server and connects these tools to the LLM agent."""
    openai_llm = cl.user_session.get("llm")
    mcp_cache = cl.user_session.get("mcp_tool_cache", {})
    mcp_tools = cl.user_session.get("mcp_tools", {})
    agent_tools = cl.user_session.get("agent_tools", [])
    try:
        logger.info("Connecting to MCP")
        mcp_client = BasicMCPClient(connection.url)
        logger.info("Connected to MCP")
        mcp_tool_spec = McpToolSpec(client=mcp_client)
        logger.info("Unpacking tools")
        new_tools = await mcp_tool_spec.to_tool_list_async()
        for tool in new_tools:
            if tool.metadata.name not in mcp_tools:
                mcp_tools[tool.metadata.name] = tool
                # Create the per-server cache entry on first use
                mcp_cache.setdefault(connection.name, []).append(tool.metadata.name)
        agent = FunctionAgent(
            tools=agent_tools + list(mcp_tools.values()),
            llm=openai_llm,
        )
        cl.user_session.set("agent", agent)
        cl.user_session.set("context", Context(agent))
        cl.user_session.set("mcp_tools", mcp_tools)
        cl.user_session.set("mcp_tool_cache", mcp_cache)
        await cl.Message(
            f"Connected to MCP server: {connection.name} on {connection.url}",
            type="assistant_message",
        ).send()
        await cl.Message(
            f"Found {len(new_tools)} tools from {connection.name} MCP server.",
            type="assistant_message",
        ).send()
    except Exception as e:
        await cl.Message(
            f"Error connecting to tools from MCP server: {str(e)}",
            type="assistant_message",
        ).send()
```

This ensures any newly discovered tools are merged into the session's toolset.

### 🔌 Step 2: Handle Disconnections

When a user disconnects from an MCP server, we'll cleanly remove its tools:

```
@cl.on_mcp_disconnect
async def on_mcp_disconnect(name: str):
    """Handler for disconnects from an MCP server.
    Updates the tool list available to the LLM agent.
    """
    openai_llm = cl.user_session.get("llm")
    agent_tools = cl.user_session.get("agent_tools", [])
    mcp_tools = cl.user_session.get("mcp_tools", {})
    mcp_cache = cl.user_session.get("mcp_tool_cache", {})

    if name in mcp_cache:
        for tool_name in mcp_cache[name]:
            del mcp_tools[tool_name]
        del mcp_cache[name]

    # Update tools list in agent
    if len(mcp_tools) > 0:
        agent = FunctionAgent(
            tools=agent_tools + list(mcp_tools.values()),  # agent keeps the tools not removed
            llm=openai_llm,
        )
    else:
        agent = FunctionAgent(
            tools=agent_tools,
            llm=openai_llm,
        )
    cl.user_session.set("context", Context(agent))
    cl.user_session.set("mcp_tools", mcp_tools)
    cl.user_session.set("mcp_tool_cache", mcp_cache)
    cl.user_session.set("agent", agent)

    await cl.Message(
        f"Disconnected from MCP server: {name}",
        type="assistant_message",
    ).send()
```

### 🐳 Step 3: Run Jira MCP via Docker

Jira's MCP is available on [Docker](https://github.com/sooperset/mcp-atlassian) (do check the official MCP tool page of whichever service you're looking for), so all we need to do is [get a Jira API token](https://support.atlassian.com/atlassian-account/docs/manage-api-tokens-for-your-atlassian-account/), pull the image and run it:

```
docker run --rm -p 9000:9000 \
    --env-file ./.env \
    ghcr.io/sooperset/mcp-atlassian:latest \
    --transport sse --port 9000 -vv
```

Once it's running, click the plug icon on our Chainlit home page to connect.

Yup, there are 26 tools!

Now let's have some fun!
Let's get GPT-4o-mini to check for active issues in our current sprint (note that you'll need to know at least the project key of your Jira project — in my case it's "JIRACHAT").

Since we've already written all the features above, let's resolve these issues!

You can also tell the LLM to complete the sprint! What you get is a nice, clean Jira home page indicating that the issues have been completed.

_PS: This MCP image also works for Confluence! So you can get your LLM to write sprint retrospectives on Confluence, or use this MCP to RAG Confluence pages._

## It gets better — deployment

Chainlit apps don't have to be deployed only as standalone web applications. They can also be [mounted as a copilot onto a website](https://docs.chainlit.io/deploy/copilot) as a chat bubble, or deployed on [Teams](https://docs.chainlit.io/deploy/teams), [Slack](https://docs.chainlit.io/deploy/slack) or even [Discord](https://docs.chainlit.io/deploy/discord) as a chatbot. The same `app.py` we built has incredible mileage — you barely need to change a thing.

It must be said, however, that you tend to lose control over the user experience when deploying Chainlit chatbots via these alternatives, because Chainlit must comply with each platform's service requirements. You also need administrative access to deploy on Teams, Slack and Discord, which I don't have, so I'll just briefly demonstrate copilot deployment.

You're going to have to know quite a bit of JavaScript and use the Chainlit `CopilotFunction` abstraction to shuttle arguments between the frontend and your Chainlit bot — I've done this at work with some struggle. The challenge comes from needing to authenticate the user at the main website first, which means that your app.py's `password_auth_callback` will throw an error.

Instead of adding to our almost 600-line app.py, I've elected to code a simple chatbot to quickly illustrate the core concepts needed:

```
# simple_app.py
import chainlit as cl
from llama_index.llms.openai import OpenAI

llm = OpenAI(model="gpt-4o-mini", temperature=0)

@cl.on_message
async def on_message(message: cl.Message):
    reply = await llm.acomplete(message.content)
    response = await cl.Message(content=str(reply)).send()
    if cl.context.session.client_type == "copilot":
        fn = cl.CopilotFunction(
            name="test",
            args={"message": message.content, "response": response.content},
        )
        await fn.acall()
```

Yup, it's just a chatbot that answers questions — we didn't even implement memory, and it doesn't even have an `on_chat_start`!

If Chainlit detects that the client's session is in copilot mode, the `cl.CopilotFunction` class creates a function call event that is sent from the Chainlit backend to the embedded Copilot widget on your website or application.

When you instantiate it with a `name` and `args`, and then invoke `.acall()`, Chainlit emits a `chainlit-call-fn` event to the frontend, carrying those parameters. A corresponding JavaScript event listener on the host page picks up that event, executes the requested action (e.g., updating the UI or fetching data), and returns a result via a callback — which is then propagated back into your Chainlit app for further processing.
In our index.html, be sure to add the required code in the script tags (see the [Chainlit copilot docs](https://docs.chainlit.io/deploy/copilot)). This mounts the Chainlit widget as a little chat bubble and instantiates an event listener to pass the message to our Chainlit application.

Also, be sure to edit your `.chainlit/config.toml` file to allow the origin that your website is deployed on. This is annoying because "*" typically results in a failed-to-fetch error, so you have to be explicit:

```
allow_origins = ["http://127.0.0.1:5500"]
```

The result looks really good!

You can even click the top right-hand corner of the chat bubble to expand the chat window. Copilot mode supports all the advanced features we built earlier!

Just be mindful to handle the authentication process for your Chainlit application on your website. The sticking point is passing authenticated user context from the host site into the Chainlit app — especially since `password_auth_callback` doesn't play nicely with copilot embeds.

If you manage to get auth working cleanly, do drop me a note — I'd love to learn from your setup too!

## **Concluding thoughts**

We've covered a lot of ground! Developing LLM applications with Chainlit is genuinely enjoyable — it offers a level of simplicity and flexibility that's hard to beat. The mileage you can get out of a single `app.py` file is impressive. I hope this series has served as both a practical starting point and a useful cookbook for anyone looking to build with Chainlit.

_Disclaimer: All opinions and interpretations are that of the writer, and not of MITB. I declare that I have full rights to use the contents published here, and nothing is plagiarized. I declare that this article is written by me and not with any generative AI tool such as ChatGPT. I declare that no data privacy policy is breached, and that any data associated with the contents here are obtained legitimately to the best of my knowledge. I agree not to make any changes without first seeking the editors' approval.
Any violations may lead to this article being retracted from the publication._

[Chainlit](/tag/chainlit?source=post_page-----00036c8db1ba---------------------------------------) · [Elevenlabs Ai](/tag/elevenlabs-ai?source=post_page-----00036c8db1ba---------------------------------------) · [Llamaindex](/tag/llamaindex?source=post_page-----00036c8db1ba---------------------------------------) · [Llm](/tag/llm?source=post_page-----00036c8db1ba---------------------------------------) · [Data Science](/tag/data-science?source=post_page-----00036c8db1ba---------------------------------------)
# Topological Vortex Theory Helps AI Enable a New Paradigm in Scientific Research

### Bao-hua ZHANG (张保华)

**Abstract:**

**Topological vortex theory (TVT)**, one of the central concepts in modern condensed matter physics and field theory, describes stable structures with topological protection that emerge in complex systems. These structures, from skyrmions in quantum materials to spacetime defects in cosmology, exhibit extraordinary robustness and rich dynamic behaviors. Meanwhile, artificial intelligence (AI), particularly machine learning (ML) and deep learning (DL), is reshaping the process of scientific discovery in unprecedented ways. This paper aims to explore an emerging interdisciplinary frontier: how to leverage AI technologies, especially physics-informed neural networks (PINNs), generative models, and reinforcement learning, to accelerate, deepen, and revolutionize scientific research based on topological vortex theory.
We will systematically elaborate on the enabling role of AI in four key aspects: **theoretical modeling and simulation, experimental data analysis, prediction of new materials and phenomena, and the construction of autonomous scientific discovery systems**. Ultimately, we argue that this fusion will catalyze a new paradigm of scientific research characterized by "data-driven" discovery and "intelligent emergence".

**Keywords:** topological vortex; artificial intelligence; scientific discovery; machine learning; multiphysics simulation; skyrmion; inverse design

## 1. Introduction

Topological vortices are non-trivial topological excitations whose stability is guaranteed by the global topological properties of the system rather than by local energy minimization. This gives them astonishing stability against external perturbations, making them star candidates for next-generation information storage (e.g., topological magnetic memory), quantum computing (e.g., anyon braiding), and functional material design. However, research on topological vortices faces significant challenges:

**1) High theoretical complexity:** Their dynamics often involve highly nonlinear partial differential equations (e.g., the Landau-Lifshitz-Gilbert equation), making analytical solutions extremely difficult.

**2) Multiscale characteristics:** Their behavior spans from microscopic quantum effects to macroscopic classical phenomena, making traditional numerical simulations (e.g., micromagnetic simulations) computationally expensive.

**3) Difficulty of experimental detection:** Observation techniques (e.g., Lorentz TEM, MFM, STM) generate massive and noisy data, making the precise extraction of vortex morphology, dynamic parameters, and topological charge highly challenging.

**4) Vast design space:** Exploring new materials (e.g., magnetic van der Waals materials, superconductors) with novel topological states is like searching for a needle in a haystack.

AI technology, particularly its ability to learn complex patterns from data and discover rules in an unsupervised way, provides revolutionary tools to address these long-standing challenges. This paper aims to construct a unified framework to clarify how AI acts as an "enabler" and "accelerator", intervening fully in the lifecycle of topological vortex research.

## 2. AI Empowerment in Topological Vortex Theory and Simulation

### 2.1 Surrogate Models and Accelerated Computation

Traditional micromagnetic or molecular dynamics simulations require enormous computational resources to track the evolution of every magnetic moment or atom. AI can train **surrogate models**, i.e., use deep neural networks to learn the input-output relationships of high-fidelity simulations.
Once trained, this neural network can predict the stability and dynamics of topological vortices under different external conditions (e.g., magnetic field, current, temperature) in milliseconds, several orders of magnitude faster than traditional numerical methods. This enables large-scale parameter scanning and phase diagram mapping.

### 2.2 Solving Complex Partial Differential Equations

Physics-informed neural networks (PINNs) embed physical laws (e.g., energy functionals, equations of motion) into the neural network in the form of loss terms. PINNs can directly solve the partial differential equations describing topological vortices without requiring extensive labeled data. They are particularly adept at handling **inverse problems**, e.g., inferring key physical parameters of a material (such as the damping constant or the exchange stiffness constant) from experimentally observed vortex motion trajectories.
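As an illustration of the loss-function embedding described above, here is a toy PINN sketch in PyTorch (the framework is an assumption; the paper does not prescribe one). It fits a simple decay equation du/dt = -k·u — a stand-in for the far richer Landau-Lifshitz-Gilbert dynamics — while treating k as a learnable parameter, in the spirit of the inverse-problem setting mentioned in the text:

```
import torch
import torch.nn as nn

# Toy PINN: learn u(t) obeying du/dt = -k*u, with k an unknown physical parameter.
net = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))
log_k = torch.nn.Parameter(torch.tensor(0.0))  # learnable (log of) the unknown parameter
opt = torch.optim.Adam(list(net.parameters()) + [log_k], lr=1e-3)

t_obs = torch.linspace(0, 1, 20).unsqueeze(1)    # "experimental" time stamps
u_obs = torch.exp(-2.0 * t_obs)                  # synthetic observations (true k = 2)
t_col = torch.linspace(0, 1, 200).unsqueeze(1).requires_grad_(True)  # collocation points

for step in range(5000):
    opt.zero_grad()
    u = net(t_col)
    du_dt = torch.autograd.grad(u.sum(), t_col, create_graph=True)[0]
    k = log_k.exp()
    physics_loss = ((du_dt + k * u) ** 2).mean()       # residual of the governing equation
    data_loss = ((net(t_obs) - u_obs) ** 2).mean()     # fit to the observations
    (physics_loss + data_loss).backward()
    opt.step()

print(f"recovered k ≈ {log_k.exp().item():.3f}")        # should move toward 2.0
```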
## 3. AI Empowerment in Experimental Data Analysis and Knowledge Extraction

### 3.1 Intelligent Image Recognition and Segmentation

Convolutional neural networks (CNNs) enable real-time, high-precision identification, counting, and segmentation of topological vortices in transmission electron microscopy (TEM) or magnetic force microscopy (MFM) images. AI can overcome noise and artifacts in images, even inferring magnetization vector directions from blurry contrast and **automatically labeling the topological charge**, greatly reducing the burden of manual analysis.

### 3.2 Dynamic Trajectory Analysis and Feature Extraction

By analyzing time-series data (e.g., video) with recurrent neural networks (RNNs) or Transformer models, AI can automatically track the motion trajectories of multiple vortices, classify their interaction modes (attraction, repulsion, annihilation), and extract microscopic dynamic features that are difficult to perceive manually, providing data support for building more accurate theoretical models.

## 4. AI Empowerment in Predicting New Materials and Phenomena

### 4.1 Generative and Inverse Design

Generative adversarial networks (GANs) or diffusion models can learn from databases of known topological materials (crystal structure, composition, band structure) and then **generate** entirely new virtual material structures with desired properties. For example, specifying the goal "Generate a material that hosts stable skyrmions at room temperature, drivable by low current," AI will search and "imagine" within the vast chemical space, providing prioritized candidate targets for experimental synthesis.

### 4.2 Predicting Novel Topological States

Through unsupervised learning (e.g., clustering algorithms) applied to large amounts of simulation or experimental data, AI can discover novel topological phases or critical points of phase transitions not preset by humans. It can reveal complex correlations between order parameters, thereby predicting brand-new topological excitations and guiding theoretical physicists to establish new frameworks to explain these "anomalous" phenomena discovered by AI.

## 5. Towards Autonomous Scientific Discovery: Closed-Loop Intelligent Systems

The highest form of AI empowerment is building **autonomous scientific discovery systems**. This is a closed loop integrating all the above capabilities:

**1)** **AI designs** an experimental plan (e.g., a new material or a measurement protocol).

**2)** A **robotic experimental platform** automatically executes synthesis and measurement.

**3)** **AI analyzes** the generated data in real time.

**4)** Based on the analysis results, **AI plans** the next optimal experiment to test hypotheses or narrow the search space.

In this cycle, the role of the human scientist shifts from direct operator to goal setter and ultimate interpreter. For topological vortex research, such a system could automatically explore vast parameter spaces of material composition, heterostructure design, and external fields, discovering optimized topological device prototypes at unprecedented speed.

## 6. Challenges and Outlook

Despite the promising prospects, this integrated field still faces challenges:

**1) Data quality and scarcity:** High-quality, well-annotated datasets for topological physics are still lacking.

**2) Model interpretability:** The "black box" nature of AI models sometimes makes the physical mechanisms behind their predictions difficult to understand. Developing explainable AI (XAI) is crucial.

**3) Embedding physical principles:** How to embed prior knowledge such as physical conservation laws and symmetries more deeply into AI models is key to ensuring the physical plausibility of their predictions.

In the future, we anticipate the rise of an AI-driven "topological vortex informatics" field. AI will not only be a tool but also a collaborator, helping scientists unravel the deepest mysteries of topological vortices and ultimately translate this knowledge into transformative technologies.
## 7. Conclusion

The integration of topological vortex theory (TVT) and artificial intelligence represents a paradigm shift in scientific methodology. Through its powerful capabilities in pattern recognition, optimization, and generation, AI is breaking through traditional bottlenecks in topological physics research, empowering it in all aspects from accelerating computation and parsing data to predictive design. This synergy will not only foster a deeper understanding of topological physics itself but also shape a more efficient, intelligent, and exploratory new generation of scientific research. Ultimately, this "vortex of intelligence" may sweep across the entire scientific community, pushing it to new heights.
### Academic Statement

The viewpoints, theoretical frameworks, and prospects expressed herein represent the author's academic opinions. Artificial intelligence (**DeepSeek-V3**) was used as an auxiliary tool in the preparation of this document for literature retrieval, idea expansion, and language optimization; however, the core arguments, theoretical construction, and academic judgments were independently completed by the author, who takes responsibility for them. This article is intended for academic discussion and exchange. Without the explicit permission of the author, please do not use the entire or partial content of this document for commercial purposes. If you need to cite the viewpoints or content of this article, please use appropriate academic citation formats for attribution.

### Related readings

**[1]** [Space Is A Physical Entity Nurturing Spacetime and Matter](https://zhuanlan.zhihu.com/p/1930897490367973024)
**[2]** [Space Phase Transitions](https://zhuanlan.zhihu.com/p/1933828835322856603)
**[3]** [On the Dynamical Correlation between an Absolute Space Background and an Ideal Fluid](https://zhuanlan.zhihu.com/p/1950477539916091826)
**[4]** [An Introduction to Topological Vortex Spacetime Ontology (TVSTO)](https://zhuanlan.zhihu.com/p/1939230807551095002)
**[5]** [The Free-Energy Functional and a General Expression for Topological Manifolds Based on Topological Vortex Theory (TVT)](https://zhuanlan.zhihu.com/p/1956018957728354905)
**[6]** [Exploring the Topological Nature of the Dirac Equation Based on Topological Vortex Theory (TVT)](https://zhuanlan.zhihu.com/p/1954126217461602098)
**[7]** [On the Scientific Plausibility of Topological Vortex Theory (TVT) Against the Spacetime Background of Relativity and Quantum Theory](https://zhuanlan.zhihu.com/p/1953487291109577704)
**[8]** [Interpreting "More is Different" Through Topological Vortex Theory (TVT)](https://zhuanlan.zhihu.com/p/1951573390151378560)
**[9]** [On the Isomorphism Between Topological Vortex Theory (TVT) and the Mass-Energy Equation](https://zhuanlan.zhihu.com/p/1948342728527185301)
**[10]** [A Mathematical Framework for Vortex Nucleation Based on Topological Vortex Theory (TVT)](https://zhuanlan.zhihu.com/p/1941514490928686253)

Published on 2025-10-01.
![A bath at the Takamiya ryokan, or traditional Japanese inn, in Yamagata.](https://assets.bwbx.io/images/users/iqjWHBFdfxIU/iqzvmKvEdd6g/v0/-1x-1.webp)
A bath at the Takamiya _ryokan_, or traditional Japanese inn, in Yamagata.
Markets Magazine
# Japan Unleashes Capitalism by Letting ‘Zombie’ Companies Die
In agonizing decisions for owners, a once-protective society lets the market take its course.
By [Kanoko Matsuyama](https://www.bloomberg.com/authors/AOYOCyqeQxk/kanoko-matsuyama "Kanoko Matsuyama")
Photographs by Noriko Hayashi
October 2, 2025 at 7:00 PM EDT
The [Zao Onsen Ski Resort](https://www.zao-spa.or.jp/english/ "Zao Onsen ski resort") in northern Japan offers Aspen-style deep powder and trails lined with its famed “snow monsters,” pine trees glistening with ice and snow from biting Siberian winds. For generations the Sato family operated a thriving lodging business nearby.
Its flagship property, [Hotel Oak Hill](https://www.zao-oakhill.com/ "Hotel Oak Hill"), has 30 rooms, soaring ceilings and access to the region’s hot springs for après-ski unwinding.
![“Snow Monsters” of Zao mountains in Yamagata, Japan.](https://assets.bwbx.io/images/users/iqjWHBFdfxIU/iSdLNXTtchn4/v0/-1x-1.webp)
“Snow Monsters” of Zao mountains in Yamagata, Japan. Source: Alamy
The company flourished in Japan’s post-World War II economic boom. Children from a famous Tokyo private school, known for educating aristocratic families, came every winter, their photos lining the hallways of the hotel’s more modest sister property. Called Yoshidaya, it was a ryokan, or traditional Japanese inn, that welcomed guests with communal baths, futons for sleeping and local cuisine. “The ryokan was our life, family and business,” says Chigusa Sato Nolen, 57, who grew up skiing at the resort and spent her early childhood living full time in the inn.
But during the long [stagnation](https://www.bloomberg.com/news/features/2024-09-24/japan-tsmc-s-plant-is-reviving-kumamoto-but-leaves-other-towns-behind "Japan: TSMC's Plant Is Reviving Kumamoto But Leaves Other Towns Behind") after Japan’s economic bubble of the 1980s burst, her family’s hotels struggled. They earned just enough to pay interest on their debts, joining the growing ranks of companies that survived in part because the Bank of Japan slashed interest rates to zero so borrowers wouldn’t default on their loans. By 2010 almost 1 in 5 companies limped along because of bailouts and other financing relief, rather than tackling their fundamental problems. In 2008 three economics professors, one from Japan and two from the US, coined a term for these enterprises: “zombie companies.”
![Chigusa Sato Nolen working at Yoshidaya ryokan.](https://assets.bwbx.io/images/users/iqjWHBFdfxIU/i8FvbconM10I/v0/-1x-1.webp)
Chigusa working at Yoshidaya.
Proprietors were reluctant to shut them down. Unlike in the US, where President Donald Trump bounced back from his companies’ bankruptcies, the stigma of failure runs deep in a business culture defined by diligence and honor. Lifelong commitment to workers, who offer loyalty in return, is the norm. “If you travel outside of big cities, you’ll see quite a few places where you wonder how these people are making a living at all,” says Kazuyoshi Komiya, a consultant who helps small and medium-size companies with turnarounds. “Japan has been overprotective of them.”
![Bloomberg Markets Japan issue.](https://assets.bwbx.io/images/users/iqjWHBFdfxIU/i2730BC9dGrw/v0/-1x-1.webp)
Featured in the October/November issue of [_Bloomberg Markets_](https://www.bloomberg.com/markets-magazine "Bloomberg | Markets Magazine"). Photographer: Ben Weller for Bloomberg Markets
Now the country’s financial landscape is [changing](https://www.bloomberg.com/news/features/2025-07-24/right-wing-populism-is-pushing-japan-into-rare-political-turmoil "Japan’s Ruling Party Struggles as Prices Rise, Yen Sinks"). In March 2024 the Bank of Japan [increased](https://www.weforum.org/stories/2024/03/japan-ends-negative-interest-rates-economy-monetary-policy/ "Japan ends era of negative interest rates. A chief economist explains") interest rates for the first time in 17 years, reflecting an improving economic climate and rising prices. The Japanese government is embracing what mid-20th century economist Joseph Schumpeter called capitalism’s “creative destruction,” the need for innovators to replace weaker rivals.
Regulators have been pushing companies to improve their governance, and recently Japan passed a law to streamline restructurings and encourage turnarounds. “It’s important for the corporate sector to see healthy regeneration,” [Yasuo Goto](https://researcher.seijo.ac.jp/html/100000282_en.html "Yasuo Goto bio"), a professor at Seijo University who studies zombie companies, said in a presentation last year at the economy ministry’s Small and Medium Enterprise Agency. The Tokyo Stock Exchange is pushing underperforming public companies to become more profitable and engage more with investors. In 2024 the number of zombie companies began to edge lower for the first time in seven years. In the year ended in March, bankruptcies climbed 13%, to 10,070, the most since 2014.
### Letting Go
Chart: Corporate bankruptcy filings in Japan, years ended March 31. Source: Teikoku Databank
For now bankruptcies have yet to claim larger companies, and people have generally been able to find [new jobs](https://www.bloomberg.com/news/features/2025-06-10/japan-finance-jobs-boom-as-global-banks-battle-for-top-talent "Japan Finance Jobs Boom as Global Banks Battle for Top Talent"), thanks to a chronic [labor shortage](https://www.bloomberg.com/news/features/2025-03-06/thousands-of-cat-eared-robots-are-waiting-tables-in-japan "Thousands of Cat-Eared Robots Are Waiting Tables in Japan") in Japan. But Schumpeter also recognized the potential for the dislocation of failing companies to lead to social unrest. “A small cohort of winners versus everyone else risks destabilizing the economy and society,” Goto says.
Across town from Oak Hill, Zao Center Plaza—a commercial complex with a hot spring, an inn and restaurants—went bust almost two years ago. Employees lost their jobs, and the buildings were razed. All that was left was roughly ¥1 billion ($6.8 million) in unpaid loans and a patch of dirt. “The debt repayments were a huge burden,” says Masaru Funami, who was hired by the owners to run the facility and was the last to turn off the lights. The plaza made money only two months out of the year and couldn’t raise prices enough to cover costs. “Even if someone had stepped in, it would have been tough to turn it around,” Funami says. “In the end it was the right time to let go.”
Last year the Sato family faced the same painful choice.
![An outdoor bath, fed by an onsen, or hot spring, in Yamagata.](https://assets.bwbx.io/images/users/iqjWHBFdfxIU/im1KeZ.NsTCc/v0/-1x-1.webp)
An outdoor bath, fed by an _onsen_, or hot spring, in Yamagata.
![Okama Crater Lake, which sits below the Zao Mountains.](https://assets.bwbx.io/images/users/iqjWHBFdfxIU/i4jojQ5aELVY/v0/-1x-1.webp)
Okama Crater Lake, which sits below the Zao Mountains.
![The Zao Onsen district of Yamagata.](https://assets.bwbx.io/images/users/iqjWHBFdfxIU/i2qvL6tuqBnA/v0/-1x-1.webp)
The Zao Onsen district of Yamagata.
![A mountain view at Zao Onsen Ski Resort.](https://assets.bwbx.io/images/users/iqjWHBFdfxIU/ij88z3iFFO28/v0/-1x-1.webp)
A mountain view at Zao Onsen Ski Resort.
For more than a millennium, travelers have journeyed to the Zao Mountains, drawn by their restorative hot springs. The Yoshidaya ryokan first appears in records from the early 1800s, when shoguns, or military rulers, still governed Japan.
Situated in the Zao Onsen district of the city of Yamagata, it was among 17 original inns that welcomed summer visitors seeking healing waters after the snow had melted.
About a century ago, the area began reinventing itself as a winter getaway, opening its first slope to those eager to adopt novel Western sports such as skiing. In a booming economy, tourism surged further in the 1960s, and a new highway later improved access. The expanding middle class flocked to ski resorts in winter and beaches in summer.
Built with wood and last updated in the 1950s, the Sato family’s Yoshidaya was a creaky labyrinth of two dozen rooms scattered over three floors. Despite the steep stairs and lack of toilets in some rooms, Gakushuin, the celebrated school in Tokyo for the gentry, chose to send its students every winter to learn to ski.
For Chigusa Sato, whose big eyes and easy smile have long put customers at ease, Yoshidaya was simply home, with life shaped by the rhythm of arriving guests. Her father, a skilled skier, sent his son and daughter out on the trails with the students. Flipping through an album, she shares pictures of the ryokan full of people, her family, staff and guests. In one photo she’s standing next to the current emperor’s younger sister. Chigusa lived on the second floor of Yoshidaya until she was about 8, when her father built a large house nearby. But she always ate at the ryokan.
![A photograph of Zao Ski Resort taken about 100 years ago, in the early Showa era, at Takamiya Ryokan.](https://assets.bwbx.io/images/users/iqjWHBFdfxIU/iaAz_QI.4GeY/v0/-1x-1.webp)
A photograph of Zao Ski Resort taken about 100 years ago, in the early Showa era, at Takamiya Ryokan.
![The Okazaki family in front of Takamiya with the current Emperor Emeritus (front row, second from left) during his youth, when he visited Zao in 1951.](https://assets.bwbx.io/images/users/iqjWHBFdfxIU/ipQDGDvonMbU/v0/-1x-1.webp)
The Okazaki family in front of Takamiya with the current Emperor Emeritus (front row, second from left) during his youth, when he visited Zao in 1951.
As Japan’s economy swelled in the ’80s, there was plenty of disposable income to spend on Hondas, Walkmans and Ralph Lauren polo shirts. Businesses thrived, and many families eagerly spent their rising fortunes.
When [Subaru Corp.](https://www.bloomberg.com/profile/company/7270:JP "Subaru Corp.") introduced a new four-wheel drive that could climb the region’s snowy roads, “my father bought one,” Chigusa says. “I asked for a red car, so that’s what he got.” To prepare her brother, Naoki, to take over the business, their father sent him to hotel management school, then later on a two-week tour of Europe’s grandest hotels.
In 1989, rattled by a rumor that the Gakushuin school was scouting for better locations for its winter trips, Chigusa’s father decided to expand. He picked a plot uphill from Yoshidaya, just a five-minute walk from the ski lifts, and borrowed ¥400 million to build a new hotel. He named it Hotel Oak Hill, after the trees on the land, and put Naoki in charge. In an era of easy money and soaring land prices, Chigusa’s father took out an additional ¥200 million loan to add an open-air bath.
That year the [Nikkei 225 stock index](https://www.bloomberg.com/quote/NKY:IND "NKY Index") peaked after rising sixfold over the prior decade; investors began to lose faith that Japanese manufacturing and finance would dominate the world. The property market cooled, fueling a vicious cycle of falling asset prices and souring bank debt.
As deflation deepened in the 1990s, people cut back on leisure travel. Annual visitors to the area halved from a prior peak of 2 million.
As bookings dried up, Chigusa’s father and brother struggled to rein in escalating costs. They’d misjudged the heating demands of the expansive great room; cutting energy expenses meant guests shivering in the cold.
> “One mistake meant firing employees. Some of them were old and would have found it difficult to find jobs again.”
Despite its modern appearance, the hotel operated more like a traditional ryokan, where breakfast and dinner are included with lodging. A typical evening meal began with seasonal appetizers and sashimi, followed by a hot pot of thinly sliced premium Yamagata beef and vegetables in a simmering broth; grilled fish; rice and soup, then finished with fruit and cake for dessert.
The effort needed to prepare and serve multicourse meals created high fixed costs, but cutting back threatened to turn away guests, especially loyal seasonal visitors. The need to lay out futons in tatami straw-mat rooms every night also kept labor costs high.
Gakushuin kept sending its students every winter, but that wasn’t enough to make up for the drop in tourism revenue. Chigusa’s father and brother cut the price of overnight stays in half, to about ¥6,000. They managed to stay open partly because they were the only place in town that welcomed travelers with pets.
During those years, Chigusa left her home, which she says gave her “no breathing room,” to study in Tokyo. After gaining some fluency in English, she found work on a US military base, where she met her American husband and added Nolen to her last name. Chigusa returned to the resort area in 2009 to care for her ailing mother. They muddled along, her father and brother barely keeping the business afloat. Virtually every yen went toward interest on the ¥600 million loan, while the principal remained untouched.
Seven years ago, Naoki suddenly died of a stroke at age 54. Chigusa believes the stress of trying to keep the doors open for far too long killed him. Then, less than a year later, her father died of cancer. Chigusa inherited the inn and hotel and the debt.
Around the same time, her nephew, Yuki Sato, was getting ready to graduate from university. He’d long dreamed of attending hotel management school after college, but those plans faded as the family business declined. When his father died, Yuki rushed home—still burdened with hefty student loans.
![Yoshidaya in the Zao Onsen district of Yamagata.](https://assets.bwbx.io/images/users/iqjWHBFdfxIU/iaXUCUiZq_70/v0/-1x-1.webp)
Yoshidaya in the Zao Onsen district of Yamagata.
Again living on-site, aunt and nephew divided responsibilities: Chigusa oversaw Yoshidaya, while Yuki managed Oak Hill. They learned for the first time the precarious state of the enterprise’s finances. They were losing more than ¥30 million each year. “We couldn’t make a move,” Yuki says. “One mistake meant firing employees. Some of them were old and would have found it difficult to find jobs again.”
They scraped by for a few more years, and even looked for a buyer for Oak Hill, but failed because the property didn’t include rights to the hot spring. They leased access instead.
In April 2024 a senior manager from [Kirayaka Bank Ltd.](https://www.bloomberg.com/profile/company/8520:JP "Kirayaka Bank Ltd."), instead of their usual banker, appeared, asking to speak with Chigusa and Yuki.
He explained, bluntly, that the bank could no longer carry their loan, leaving them to find alternate financing or lose the company. [Jimoto Holdings Inc.](https://www.bloomberg.com/profile/company/7161:JP "Jimoto Holdings Inc."), the bank’s parent company, says it took the step because of a change in policy: It no longer worked with businesses unlikely to see improved cash flow. (The company declined to comment on specific borrowers.) Ultimately, Chigusa chose bankruptcy proceedings to liquidate the company, which owed the bank ¥597 million, according to a regulatory filing. As insolvency proceedings began, she drew up a short list of anyone who might buy the hotels and keep them running.
That’s when the Okazakis stepped in. Like Chigusa’s family, the Okazaki clan had been in the region for generations, operating another ryokan—Takamiya, just up the street—since the early 1700s. They’d been able to weather Japan’s long stagnation by avoiding the construction rush of the bubble years while diversifying their customer base.
As the economy began to pick up, the Okazakis found a winning strategy: modernizing facilities, controlling expenses and embracing online reservations to tap into a growing wave of international ski tourists attracted to Zao’s powder and picturesque runs.
The Okazakis, who already operated six other accommodations in the area, bought the Sato family’s real estate from the lender. The price wasn’t publicly disclosed, but Yuki says it was a fraction of what their company owed the bank.
Hiroto Okazaki, who runs the family business with his father, says he invested an additional ¥25 million in upgrades at the hotel. He replaced futons with modern beds, installed locks linked to a digital check-in system and switched to buffet-style breakfasts as the only meal offered. The hotel was renamed the Onsen & Stay Oak Hill, to promote its hot springs, or _onsen_ in Japanese.
![Okazaki in front of Takamiya ryokan.](https://assets.bwbx.io/images/users/iqjWHBFdfxIU/iRqVIteisHJQ/v0/-1x-1.webp)
Okazaki in front of Takamiya ryokan.
Like Chigusa and her nephew, Okazaki grew up in his family’s ryokan before moving into a new house. The two boys skied on the same local team. Hiroto went to the US for an elite education, at Cornell University, where he studied at the school’s top-ranked hotel management program. He returned to work for his father.
As chief operating officer, Okazaki oversees the Takamiya group’s annual sales of about ¥3 billion. Instead of working at an office, he roams among various properties, restaurants and projects in his mother’s Mercedes-Benz wagon. He tries to visit all the hotels in the area about once a week and prefers to drop in casually to talk to managers and staff.
So far, Okazaki’s turnaround strategy is paying off: Oak Hill is targeting ¥195 million in sales this year and a profit margin of 30%. “We knew we could run it better and revive them,” says Okazaki, 35 and athletic, wearing a North Face T-shirt, khaki shorts and sandals. “It’s an important business that failed to cater to a new customer base.” The Okazakis also bought at a good time, as Japan enjoys a surge in tourism.
When it came to Oak Hill, one crucial decision was obvious from the start.
“We rehired their employees and previous owners,” Okazaki says of asking Chigusa and her nephew, Yuki, to stay. “They’re conscientious people and favored by many, many customers.”
Yuki, who’d assumed he’d have to find a new job, stayed on. With his mother, he still lives in a small house attached to Oak Hill and earns a monthly salary of about ¥250,000. While that’s only two-thirds of the national average, it’s stable, and he doesn’t have to pay rent or for most of his meals. When times were tough at Oak Hill, Yuki often skipped his own paychecks to preserve cash and used his personal credit card to cover expenses.
![Yuki Sato at Oak Hill.](https://assets.bwbx.io/images/users/iqjWHBFdfxIU/i1I8a08b4CU4/v0/-1x-1.webp)
Yuki Sato at Oak Hill.
As hotel manager, he still works a packed day—stocking supplies, handling guest checkouts, guiding visitors to breakfast and managing cleanup. Despite the busy routine, joining a larger operation means he can now take vacation days. “It’s much easier to take time off now,” Yuki says, wearing the hotel’s new uniform, a green shirt and black pants.
Recently the new bosses approved a proposal from Yuki and other employees to turn part of the great room, with its floor-to-ceiling views of the forest, into a cafe. Construction is already underway, with the goal of opening for business this year and drawing in more nonhotel patrons while offering guests a place to relax during the day. “In the past, I couldn’t act on ideas because there was no budget,” Yuki says. “I’ve never seen the upside of the business before, so that’s new to me.”
![The Oak Hill breakfast buffet.](https://assets.bwbx.io/images/users/iqjWHBFdfxIU/iIFyi6nJ_9KU/v0/-1x-1.webp)
The Oak Hill breakfast buffet.
For her part, Chigusa says she’s glad to have a new job, even if it’s a variation on the one she had as an owner. Separated from her husband, she lives alone in a small room tucked behind Yoshidaya’s reception area. After her mother died in 2020, she emptied out her family’s six-bedroom house and now rents it out to ski instructors each winter.
Chigusa had to give up her leased red Nissan Leaf when the business collapsed, and she bought a used van from the hotel for ¥30,000. She wakes up before 5:30 a.m. every day and cooks breakfast for herself and the staff before saying goodbye to departing guests.
Life isn’t easy, but for Chigusa the bankruptcy lifted an enormous burden, a testament perhaps to the redemptive possibilities of allowing the forces of creative destruction to take hold. Rather than a failure, she says, she considers the closure of her family’s centuries-old business one of her “proudest and most necessary decisions.” “If I passed the business to Yuki, he would have ended up like his father, and that’s too much for me to bear,” she says, tears welling up. “I knew I had to somehow put an end to it.”
_Matsuyama reports from Bloomberg’s Tokyo bureau._
* Editors: Reed Stevenson, John Hechinger
* Photo Editor: Donna Cohen