MCP Demystified: What Actually Goes Over the Wire?? | HackerNoon

News Room
Last updated: 2025/06/22 at 10:24 PM

Everyone’s dissecting MCP like it’s the Rosetta Stone of AI. Diagrams! Whitepapers! Thought leaders making TikToks!

But where, my dear data druids, are the actual packets? That’s what’s been driving me nuts: nobody shows them.

It’s like trying to learn surgery from a motivational poster. Sure, the theoretical framework is *inspiring*, but I want to see the blood and guts! Where’s the raw JSON? Where are the stdin/stdout dumps? How am I supposed to animate my creation when I can’t even see the lightning?

Trying to learn a protocol without wire dumps is like trying to learn fencing from a PowerPoint.

Give me the electrons or give me death.

Now that we have our mantra, let’s dig in!

Is this your business model?

Your MCP Business Plan:

  1. ✅ Learn MCP
  2. ❓ ??? (…a miracle occurs…)
  3. 💰 PROFIT

Sound familiar? Of course it does! It’s the startup equivalent of turning lead into gold. Alchemy, but with a LOT more JSON.

Fortunately…

You’re Not Alone!

It would appear that reams of ink have been spilled over this so-called Model Context Protocol. Everyone’s talking about LLMs evolving, tool calling, function calling, bidirectional this and capability that. I, of course, am no exception. But one thing kept clawing at my brain stem as I strolled down this cheerful meadow path of theory and abstraction:

“What’s this look like ON THE WIRE???”

Sure, the whole protocol is documented at modelcontextprotocol.io, but I don’t want philosophy. I want to see the bits. I want to feel like Dr. Frankenstein tending to my golem, lightning and all. Not wading through another semantic haystack of obscure BNF-flavored syntax grammar.

Woe was I, a mere sojourner among the would-be-enlightened, until I cracked it open myself thusly!

Initial Observations From the Field

MCP is very, very new. Bleeding edge. Possibly still bleeding. Perhaps one day, entire books will be written about its internal politics, like some kind of digital constitutional convention. And let’s be real — that book will be sold as an NFT.

But for now, one thing stands out above all:

TOOL CALLING IS ALL YOU NEED

If we zoom way, way in on this single, critical use case — tool invocation — we can ignore a lot of the fluff. We can simplify until we’re staring at the beautiful skeleton of the system: just four message types for output, and three for input. (We’ll explain why the numbers don’t match exactly a little later.)

  • initialize
  • initialized
  • tools/list
  • tools/call
  • (and the 3 results that come back)

Yes, that’s it. You can understand the entire protocol by knowing about these seven messages.

We’ll use weather.py — a simple MCP server example that lets you query forecast and alert data. You can find it in their GitHub here: [https://github.com/modelcontextprotocol/quickstart-resources/blob/main/weather-server-python/weather.py].

Let’s talk about what actually happens “on the wire” when you talk to this thing.

First off: MCP is transport agnostic. That means the spec doesn’t care how you connect to the server. HTTPS? WebSockets? Named pipes? Tapping into an ancient Ethernet vampire cable? Doesn’t matter.

In practice, the most common method is to launch the server as a subprocess and communicate over stdin and stdout. This gives you a nice little private communication channel. (Technically full-duplex, but if you add in stderr, we’ll call it 1.5-duplex to be cute).

How Messages Flow

MCP uses JSON-RPC 2.0 over the wire. This gives you a request/response protocol, plus notifications.

Each message is sent as a single line of JSON. It’s like a digital telegram service where each line is a complete package of meaning.

Is this in the spec? Sort of; not really. The spec leaves it open. But this newline-delimited JSON is the default idiom.
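To make that idiom concrete, here’s a minimal sketch of the framing (the `frame`/`unframe` helper names are mine, not part of any SDK): each message becomes one UTF-8 line, and the reader splits the stream back apart on newlines.

```python
import json

def frame(message: dict) -> bytes:
    """Serialize one JSON-RPC message as a single newline-terminated line."""
    return (json.dumps(message) + "\n").encode("utf-8")

def unframe(buffer: bytes) -> list:
    """Split a received byte stream back into one JSON object per non-empty line."""
    return [json.loads(line) for line in buffer.splitlines() if line.strip()]

# Two messages on the "wire": one request, one notification
wire = frame({"jsonrpc": "2.0", "id": 1, "method": "initialize"}) + \
       frame({"jsonrpc": "2.0", "method": "notifications/initialized"})
messages = unframe(wire)
```

As long as no message contains a raw newline inside the serialized JSON (json.dumps escapes them for you), the framing is unambiguous.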

The Myth of “stdin/stdout Considered Harmful”

Yes, you may find some dramatic blog posts claiming this pattern is “dangerous” or “unreliable.” Maybe they even wrote it in all-caps: “STDIN/STDOUT MCP COMMUNICATION CONSIDERED HARMFUL.”

Seems scary! We don’t want to start a fire in the lab, do we? Why would people say this if it weren’t true?

You see, in Ye Olden Times, extra-long lines could cause buffer overflows (hello, Morris Worm). Fortunately, most modern software doesn’t suffer from this problem (as much! Fingers crossed!). Modern JSON parsers are both fast and resilient. Want to send a 256MB weather alert? Go for it. Just… maybe don’t do it every second.

These warnings usually come from people manually wiring file descriptors in C. You are not doing that. You are using Python. Specifically, the subprocess module that’s built precisely for this kind of task.

# fork off an MCP subprocess

import subprocess as sp

# Redirect stderr to /dev/null
stderr = sp.DEVNULL

# Start the weather.py process
process = sp.Popen(
    ["uv", "run", "weather.py"],
    stdin=sp.PIPE,
    stdout=sp.PIPE,
    stderr=stderr
)

Easy, right? All native Python. No extra packages. Batteries included.

The subprocess module replaces the child’s I/O streams with pipes, giving you complete dominion. We can also send our new process signals if we so desire.

Now, we’re finally ready to do protocol science like a proper digital necromancer!


Plot Twist: It’s Embarrassingly Simple

MCP, in all its enterprise-robe-and-scepter regalia, is just this:

YOU: "Hello, I speak robot."
SERVER: "Delightful. I also speak robot."
YOU: "Excellent, we both can speak robot!"
YOU: "What tricks can you do?"
SERVER: "I can juggle and summon storms."
YOU: "Storm, please!"
SERVER: "⛈️ As you wish."

That’s literally it. This entire protocol is just a very formal way of asking someone to do a thing and getting confirmation that they did it.

The rest is just enterprise-grade ceremony and glitter glue around this beautifully simple handshake.

That’s it. Seven polite exchanges.

Stitching the Monster Together: A Working Prototype

Time to get our hands dirty. We’re going to spawn a weather server and interrogate it like a proper mad scientist.

Tip #1: The best way to understand a protocol is to abuse it until it screams, then patch it until it cooperates. Works on code, houseplants, and occasionally old Subarus.

Tip #2: If you torture your LLMs, I won’t be held responsible for what happens to you in The After Times When AI Taketh Over!

Step 1: Summoning Our Digital Minion


# Behold! We create life!
# (fork off an MCP subprocess)

import subprocess

# Start the weather.py process
# Use `uv run` to execute the script in the uv environment
# Redirect stdin, stdout to PIPE for communication
# Redirect stderr to DEVNULL to suppress error messages

process = subprocess.Popen(
    ["uv", "run", "weather.py"],
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
    stderr=subprocess.DEVNULL  # Silence the screams
)

The subprocess module replaces the child’s I/O streams with pipes, giving you complete control over the conversation. It’s beautifully simple: you now have a dedicated child process at your beck and call, ready to execute whatever you throw at it.

Need to get aggressive? You can even send signals to really show who’s boss.

The best part? Under normal conditions, when the parent process exits, the child goes down with the ship—so you don’t have to worry about zombie processes cluttering up your system (well, mostly).
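If you’d rather not rely on the child politely following the parent down, explicit cleanup is a few lines. A hedged sketch (the stand-in child here just reads stdin so the snippet runs without weather.py; swap in `["uv", "run", "weather.py"]` for the real thing):

```python
import atexit
import subprocess
import sys

# Stand-in child process: blocks reading stdin until EOF or a signal
process = subprocess.Popen(
    [sys.executable, "-c", "import sys; sys.stdin.read()"],
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
    stderr=subprocess.DEVNULL,
)

def reap():
    """Ask the child to exit, then insist if it dawdles."""
    process.terminate()
    try:
        process.wait(timeout=5)
    except subprocess.TimeoutExpired:
        process.kill()
        process.wait()

atexit.register(reap)
```

Registering the reaper with atexit means even an unhandled exception in your client won’t leave the server process orphaned.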

Quick aside: You’ll notice we’re using uv to spawn the process. This isn’t required, but uv is rapidly becoming the gold standard for modern Python tooling. If you haven’t checked it out yet, you absolutely should—it’s a game-changer.

If you’re still using pip, we need to talk.

Step 2: The Awkward First Date

Every good relationship starts with mutual identification. We will start by announcing ourselves (like a gentleman):


# Define the initialize request
init_request = {
  "jsonrpc": "2.0",
  "id": 1,
  "method": "initialize",
  "params": {
    "protocolVersion": "2025-03-26",
    "capabilities": {},
    "clientInfo": {
      "name": "MyMCPClient",
      "version": "1.0.0"
    }
  }
}

This is our opening salvo—the first packet we send to establish communication. It tells the server three crucial things: we speak MCP, which version we’re using, and what capabilities we bring to the table.

You’ll notice this follows the JSON-RPC 2.0 format, and that’s going to be consistent throughout our entire conversation with the server. Every message, every response—all JSON-RPC 2.0.

Let’s fire it off.


import json


def send_mcp_request(process, request):
    """Sends a JSON-RPC request to the subprocess's stdin."""
    json_request = json.dumps(request) + '\n' # Add newline delimiter
    process.stdin.write(json_request.encode('utf-8'))
    process.stdin.flush() # Ensure the data is sent immediately


# 1. Send the initialize request
print("Sending initialize request...")
send_mcp_request(process, init_request)

We should see this output:


Sending initialize request... 

Now, let’s hear what the server has to say about everything so far:


def read_mcp_response(process):
    """Reads a JSON-RPC response from the subprocess's stdout."""
    # Assuming the server sends one JSON object per line
    line = process.stdout.readline().decode('utf-8')
    if line:
        print("     . . . len is", len(line))
        return json.loads(line)
    return None


# Read the server's response to the initialize request
from pprint import pprint

init_response = read_mcp_response(process)
print("Received response after initialization:", end='')
pprint(init_response)

The server, being polite, introduces itself back:


     . . . len is 266
Received response after initialization:{'id': 1,
 'jsonrpc': '2.0',
 'result': {'capabilities': {'experimental': {},
                             'prompts': {'listChanged': False},
                             'resources': {'listChanged': False,
                                           'subscribe': False},
                             'tools': {'listChanged': False}},
            'protocolVersion': '2025-03-26',
            'serverInfo': {'name': 'weather', 'version': '1.9.4'}}}

Translation: “Hello! I’m a weather server, I won’t randomly change my abilities mid-conversation, and I definitely won’t ghost you.”

So, what’s the server actually telling us here? It’s sharing protocol version info and basic details, but the capabilities section is the real prize.

We’re ignoring capabilities for this demo, but check this out: we could subscribe to “listChanged” events if our server were the type to add or remove tools dynamically. There’s actually a sneaky little pub/sub system hiding in the MCP protocol—you can listen for all sorts of events. Our weather.py server is way too simple for any of that fancy stuff, but it’s there if you need it.

Same deal with “prompts” and “resources”—we’re skipping them entirely. Sure, we could roll our own implementation, but that misses the point of API separation. The whole idea is that different systems handle different concerns, so you don’t have to reinvent every wheel. You can pick and choose which parts of the protocol to implement, but if you want to play nice with other MCP tools, you’d better stick to the spec.

Alright, we’re connected and ready to rock, right?

Wrong.

The server’s still sitting there, tapping its digital foot, waiting for us to complete the handshake. We need to send the “all good on our end” signal:


notified_request = {
  "jsonrpc": "2.0",
  "method": "notifications/initialized"
}

Notice the missing id field? That’s JSON-RPC’s way of saying “fire and forget”—no response expected or required. Unlike the usual request/response dance we’ve been doing, this is a notification. Think of it as sending an ACK packet: “Hey server, I’m ready to roll!”

No id means: “Don’t reply, I’ll trust you know what to do with this information.” Like giving compliments to your cat.
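If you want your client to enforce that rule mechanically, the check is one line (a sketch; the helper name is mine, but the rule is straight from JSON-RPC 2.0):

```python
def expects_response(message: dict) -> bool:
    """JSON-RPC 2.0: a message with an 'id' is a request and gets a reply;
    a message without one is a notification and must not be answered."""
    return "id" in message

# The two message shapes from this handshake
init_request = {"jsonrpc": "2.0", "id": 1, "method": "initialize", "params": {}}
initialized_note = {"jsonrpc": "2.0", "method": "notifications/initialized"}
```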

Meanwhile… back at the ranch… (back to the story!)

Remember that the server, being reasonably courteous, already replied with a manifesto of its abilities. We want to nod approvingly.

So, let’s confirm: yes, we are indeed ready to party.

# yes we are indeed ready to party

# Acknowledge the server so it knows we approve
print("// Sending initialized request...")
send_mcp_request(process, notified_request)

Now, the server knows to start expecting requests.

Step 3: “Show Me What You Got”

Time to see what toys this server brought to the playground:


tools_list_request = {
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/list",
  "params": {
  }
}


# 2. Send the tools/list request
print("// Sending tools/list request...")
send_mcp_request(process, tools_list_request)

We get precisely the output we expected. Perfect!


// Sending tools/list request...

Now, let’s read back the output… Time to see what treasures we’ve uncovered:


# Read the server's response to the tools/list request                                                                                    
tools_list_response = read_mcp_response(process)
print("// Received tools list response:", end='')
pprint(tools_list_response)

And our server proudly displays its wares: forecasts, alerts, and other meteorological mischief.


     . . . len is 732
// Received tools list response:{'id': 2,
 'jsonrpc': '2.0',
 'result': {'tools': [{'description': 'Get weather alerts for a US state.\n'
                                      '\n'
                                      '    Args:\n'
                                      '        state: Two-letter US state code '
                                      '(e.g. CA, NY)\n'
                                      '    ',
                       'inputSchema': {'properties': {'state': {'title': 'State',
                                                                'type': 'string'}},
                                       'required': ['state'],
                                       'title': 'get_alertsArguments',
                                       'type': 'object'},
                       'name': 'get_alerts'},
                      {'description': 'Get weather forecast for a location.\n'
                                      '\n'
                                      '    Args:\n'
                                      '        latitude: Latitude of the '
                                      'location\n'
                                      '        longitude: Longitude of the '
                                      'location\n'
                                      '    ',
                       'inputSchema': {'properties': {'latitude': {'title': 'Latitude',
                                                                   'type': 'number'},
                                                      'longitude': {'title': 'Longitude',
                                                                    'type': 'number'}},
                                       'required': ['latitude', 'longitude'],
                                       'title': 'get_forecastArguments',
                                       'type': 'object'},
                       'name': 'get_forecast'}]}}

Whoa! That’s a lot of JSON! Alright, let’s dig into the good stuff.

See that tools array up there? That’s our goldmine—each object represents a tool the LLM can invoke. (Plot twist: we can call them too, which is exactly what we’re doing right now!)

Fun fact: The ‘description’ fields are how your LLM decides what function to call. Think of it as Tinder for tools, with your AI looking at its phone trying to decide whether to swipe left or right.

One interesting side note: OpenAI originally tried to brand this as “function calling”—which is… technically accurate, I suppose. But somewhere along the way, the industry collectively decided “tools” sounded cooler (or maybe more accessible?), and now we call the whole dance “tool calling.”

Anatomy of a tool (pay attention, this is where the magic lives):

  • name: The function’s actual name—no typos allowed here! This is the definitive, canonical name of the tool; we’ll use it to refer to this tool when we call it.
  • description: Plain English for the LLM to read (this is literally what the AI looks at when deciding whether to use your tool)
  • inputSchema: JSON Schema defining what parameters you need
  • outputSchema: Conspicuously missing! Everything just returns a “big string” and hopes for the best

Why no output schema? Because we’re all basically returning JSON wrapped in strings and pretending it’s a design decision. It’s like the wild west, but with more type safety anxiety.

This actually makes sense when you consider the evolution from “function calling” to “tool calling.” Traditional functions can return any type, while Unix command-line tools (which the naming somewhat implies) just spit out text. Of course, it’s mostly JSON under the hood, so there’s still structure lurking beneath the surface.

Some models even have switches to force JSON output, so in theory, your LLM could expect structured data every time. Then again, tools might return plain text, CSV, HTML, or really anything. Multi-modal models could even return audio, images, video, or live object detection feeds—the possibilities are wonderfully chaotic.
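Since inputSchema is plain JSON Schema, a client can sanity-check arguments before anything hits the wire. A minimal sketch (checking only required keys and primitive types; a real client would use a full JSON Schema validator, and `check_arguments` is a name I made up):

```python
# Map JSON Schema primitive type names to Python types (partial, by design)
TYPE_MAP = {"string": str, "number": (int, float), "boolean": bool, "object": dict}

def check_arguments(schema: dict, arguments: dict) -> list:
    """Return a list of problems; empty list means the call looks sendable."""
    errors = []
    for key in schema.get("required", []):
        if key not in arguments:
            errors.append(f"missing required argument: {key}")
    for key, value in arguments.items():
        expected = schema.get("properties", {}).get(key, {}).get("type")
        if expected and not isinstance(value, TYPE_MAP.get(expected, object)):
            errors.append(f"{key}: expected {expected}")
    return errors

# The get_alerts inputSchema from the tools/list response above
alerts_schema = {
    "properties": {"state": {"title": "State", "type": "string"}},
    "required": ["state"],
    "type": "object",
}
```

Catching a bad argument locally beats waiting for the server to send back an error object.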

Alright, we’ve got our toolbox loaded. Let’s make our first MCP tool call!

Step 4: The Moment of Truth

Anyway, we’ve got a list of tools, now what? Let’s call one!

We select a tool. Let’s pick get_alerts to keep things simple since we don’t need latitude and longitude. If we had GPS data, get_forecast would be a better choice.


tools_call_request = {
    "jsonrpc": "2.0",
    "id": 3,
    "method": "tools/call",
    "params": {
        "name": "get_alerts",
        "arguments": {
            "state": "TX"
        }
    }
}

I picked Texas because everything’s bigger there, including the weather disasters.

Wait, hold up—why is it “tools/call” and not “tool/call”? I mean, we’re calling one tool, right?

Okay, sure, “tool/call” would sound more natural in English, but apparently, consistency with the other endpoints wins out. Plus, the spec says “tools/call,” so… we fall in line like good little developers.

With all our data ducks in a row now, we can press the Big Red Button.


# 3. Send the tools/call request                                                                                                          
print("// Sending tools/call request...")
send_mcp_request(process, tools_call_request)

# Read the server's response to the tools/call request                                                                                    
tools_call_response = read_mcp_response(process)
print("// Received tools call response:", end='')
pprint(tools_call_response)

[BEEP BEEP BOOP BOOP] (This is the sound the Big Red Button makes)

[Drumroll please]

The server’s thinking… processing… and… and…

[And the crowd goes wild!]

Voilà! Real weather data materializes. Alerts, floods, tornadoes, the works. All wrapped in structured JSON, just like your therapist ordered: (trimmed down, no one wants to see 11 pages of JSON dumps)


// Sending tools/call request...
     . . . len is 51305
// Received tools call response:{'id': 3,
 'jsonrpc': '2.0',
 'result': {'content': [{'text': '\n'
                                 'Event: Flood Advisory\n'
                                 'Area: Hidalgo, TX; Willacy, TX\n'
                                 'Severity: Minor\n'
                                 'Description: * WHAT...Flooding caused by '
                                 'excessive rainfall continues.\n'
                                 '\n'
                                 '* WHERE...A portion of Deep South Texas, '
                                 'including the following\n'
                                 'counties, Hidalgo and Willacy.\n'
                                 '\n'
                                 '* WHEN...Until 245 PM CDT.\n'
                                 '\n'
                                 '* IMPACTS...Minor flooding in low-lying and '
                                 'poor drainage areas.\n'
                                 '\n'
                                 '* ADDITIONAL DETAILS...\n'
                                 '- At 205 PM CDT, Doppler radar indicated '
                                 'heavy rain due to\n'
                                 'thunderstorms. Minor flooding is ongoing or '
                                 'expected to begin\n'
                                 'shortly in the advisory area. Between 2 and '
                                 '5 inches of rain\n'
                                 'have fallen.\n'
                                 '- Additional rainfall amounts up to 1 inch '
                                 'are expected over\n'
                                 'the area. This additional rain will result '
                                 'in minor flooding.\n'
                                 '- Some locations that will experience '
                                 'flooding include...\n'
                                 'Harlingen, Elsa, Edcouch, La Villa, Lasara, '
                                 'La Villa High\n'
                                 'School, Monte Alto, Jose Borrego Middle '
                                 'School, Satiago\n'
                                 'Garcia Elementary School, Edcouch Police '
                                 'Department, Edcouch\n'
                                 'City Hall, Edcouch Volunteer Fire '
                                 'Department, Edcouch-Elsa\n'
                                 'High School, Laguna Seca, Carlos Truan '
                                 'Junior High School,\n'
                                 'Elsa Police Department, Lyndon B Johnson '
                                 'Elementary School,\n'
                                 'Elsa Public Library, Olivarez and Lasara '
                                 'Elementary School.\n'
                                 '- http://www.weather.gov/safety/flood\n'
                                 "Instructions: Turn around, don't drown when "
                                 'encountering flooded roads. Most flood\n'
                                 '\n'
                                 'The next statement will be issued Tuesday '
                                 'morning at 830 AM CDT.\n',
                         'type': 'text'}],
            'isError': False}}

Victory! Notice that isError: false—always check this field unless you enjoy debugging mysterious failures at 3 AM.

Looks good! We’re getting clean string data back (no streaming to complicate things), which gives us options. We could parse this weather data and massage it for the LLM, or just pass the raw response along and let the model figure it out. Most implementations go with the latter approach—why do extra work when LLMs are pretty good at parsing structured text?

But if you’re building something sophisticated, pre-processing the tool output can be incredibly powerful. You could format it, filter it, combine it with other data sources, or transform it into exactly what your application needs.
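For instance, the labeled header lines in the alert text lend themselves to quick field extraction. A sketch keyed to this server’s output format (other servers will format alerts differently, so treat the label set as an assumption):

```python
def parse_alert(text: str) -> dict:
    """Pull the labeled header fields out of one alert text blob."""
    fields = {}
    for line in text.splitlines():
        key, sep, value = line.partition(":")
        if sep and key.strip() in {"Event", "Area", "Severity"}:
            fields[key.strip()] = value.strip()
    return fields

# A trimmed-down alert in the shape the weather server returns
sample = (
    "\nEvent: Flood Advisory"
    "\nArea: Hidalgo, TX; Willacy, TX"
    "\nSeverity: Minor\n"
)
```

From there you could filter by severity, deduplicate by area, or hand the LLM a one-line summary instead of eleven pages.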

And that’s a wrap! We’ve successfully registered an MCP server, initialized the connection, called a tool, and processed the results. The entire MCP dance, from handshake to data retrieval. Are your legs tired yet?


The Full Monty: Your Complete MCP Client

Here’s the whole glorious ritual in one summoning circle:

import subprocess
import json
from pprint import pprint

def send_mcp_request(process, request):
    json_request = json.dumps(request) + '\n'
    process.stdin.write(json_request.encode('utf-8'))
    process.stdin.flush()

def read_mcp_response(process):
    line = process.stdout.readline().decode('utf-8')
    return json.loads(line) if line else None

requests = {
    'init': {
        "jsonrpc": "2.0", "id": 1, "method": "initialize",
        "params": {
            "protocolVersion": "2025-03-26",
            "capabilities": {},
            "clientInfo": {"name": "MyMCPClient", "version": "1.0.0"}
        }
    },
    'initialized': {
        "jsonrpc": "2.0", "method": "notifications/initialized"
    },
    'list_tools': {
        "jsonrpc": "2.0", "id": 2, "method": "tools/list"
    },
    'call_tool': {
        "jsonrpc": "2.0", "id": 3, "method": "tools/call",
        "params": {"name": "get_alerts", "arguments": {"state": "TX"}}
    }
}

process = subprocess.Popen(
    ["uv", "run", "weather.py"],
    stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.DEVNULL
)

try:
    send_mcp_request(process, requests['init'])
    pprint(read_mcp_response(process))

    send_mcp_request(process, requests['initialized'])

    send_mcp_request(process, requests['list_tools'])
    tools = read_mcp_response(process)
    print("Available tools:", [t['name'] for t in tools['result']['tools']])

    send_mcp_request(process, requests['call_tool'])
    result = read_mcp_response(process)
    print("Weather alert received:", len(result['result']['content'][0]['text']), "characters")

finally:
    process.terminate()

The Big Reveal

You just built an MCP client using nothing but Python’s standard library.

No frameworks. No external dependencies. No magic. Just subprocess pipes and JSON—the same tools you’ve had since Python 2.7. Is this “production-ready”? Honestly? Not quite— but it’s surprisingly close. Sometimes, the simplest solutions are the most bulletproof.
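The main gap between this and production is error handling: a JSON-RPC response can carry an error object instead of a result, and a dead pipe yields None from readline. A hedged sketch of the unwrap step (the helper name is mine):

```python
def unwrap(response):
    """Return the 'result' payload, or raise on transport or JSON-RPC failure."""
    if response is None:
        # readline() returned empty: the server closed its end of the pipe
        raise ConnectionError("server closed the pipe")
    if "error" in response:
        # JSON-RPC 2.0 error object: {"code": int, "message": str, ...}
        err = response["error"]
        raise RuntimeError(f"JSON-RPC error {err.get('code')}: {err.get('message')}")
    return response["result"]
```

Wrap every read_mcp_response call in this and your mysterious 3 AM failures become loud, labeled exceptions.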

This is the dirty secret behind every fancy MCP integration you’ve ever seen. Whether it’s Claude talking to your database, GPT-4 calling your APIs, or some startup’s “revolutionary AI workflow platform”—underneath it all, it’s just this:

Spawn process. Send JSON. Read JSON. Repeat.

It’s like discovering the Wizard of Oz is just a guy with a really good sound system.

Your Next Steps Into Madness

Now that you’ve seen the belly of the beast (in code), you can:

  • Build custom MCP servers that do your bidding (no more waiting for someone else to write the integration)
  • Debug MCP connections like a network necromancer when they inevitably break at the worst possible moment
  • Design better tools by knowing exactly how LLMs consume your metadata
  • Write better tools so irresistibly described that your LLMs fall in love
  • Optimize the hell out of everything because you understand the protocol overhead
  • Or just automate your cat feeder. I don’t judge.

The Beautiful Revelation

The miracle in step 2 of your business plan? It was you, all along.

Go build something weird.


Want to see this code in action? The complete example lives here: [https://gitlab.com/-/snippets/4864350]
