Technology Radar and the Reality of AI in Software Development

News Room
Last updated: 2025/06/13 at 6:02 AM
News Room Published 13 June 2025

Transcript

Shane Hastie: Good day folks. This is Shane Hastie, the InfoQ Engineering Culture podcast. Today I get to sit down with Rachel Laycock. Rachel, welcome. Thank you so much for taking the time to talk to us.

Rachel Laycock: Hi, Shane. Thanks for having me.

Shane Hastie: Now, there’s probably a few folks in our audience who don’t know who you are. So let’s start with who’s Rachel.

Introductions [00:56]

Rachel Laycock: So I am the global CTO for Thoughtworks. I’ve been at Thoughtworks for 15 years and I play lots of different technology leadership roles, but my background is as a software developer.

Shane Hastie: In your global CTO role at Thoughtworks, you get to see a lot of what is happening and you’re across many of the trends. One of the things that I know that you are responsible for or certainly deeply involved in is the Thoughtworks Technology Radar. Can you tell us a little bit about how does that come about?

The Technology Radar Process [01:29]

Rachel Laycock: So I am responsible for it, and it’s been running for over 10 years. My predecessor started it, and it started as basically a way for her to understand what was going on. Coming back to your point about the role, we’ve got 10,000 people across the globe in many different countries and regions. So getting a view of what’s going on and what trends are important was really challenging. And that was when we were probably a third of the size that we are now. And so essentially what we do is twice a year we kind of put a call-out to the Thoughtworkers on the ground: what’s happening, what tools and techniques and platforms and languages and frameworks are you using that you think are interesting? What have you managed to get into production, what have been your experiences with it?

And we mine that from across the globe. And then we get together in person and spend a week basically debating. So we get tech leaders from across the different parts of the globe in different roles, whether they might be head of tech or a regional CTO, or there might be a practice lead and we debate about where these things should go. So they’re in different quadrants. So it could be a language or a framework or a platform or a technique. But the real debate is whether it’s something we’re assessing, whether it’s something that goes into trial or we think it’s something that people should adopt as a default, or if we say hold and you should proceed with caution, which is actually what the hold ring means. People often don’t use it at all, but it just means like, “Hey, we’ve identified some challenges with this, so you should probably proceed with caution”.

And so we spend a week debating it and getting it down; we try to get to around a hundred blips. We use the metaphor of the Radar, so things coming in and out. And then over the course of that week, what kinds of themes come up? The topics of discussion become what we end up with: our three to five key themes. And so it ends up being a little bit of a trend report, but it’s based completely on our experience. It’s not external research, it’s not peer reviewed except by the people with deep experience in the room. And people often think it’s a forward-looking report, but it’s literally a snapshot, and we get it out within a few weeks. We are pretty fast in terms of publishing, but it’s actually a look back. It’s the last six months of technology at Thoughtworks.

Shane Hastie: So what has been most interesting for you in facilitating that?

What Makes the Radar Interesting [03:56]

Rachel Laycock: That’s a really good question. What’s really interesting is where there’s a hot debate: where one region or one team is finding success with something and another has a different opinion, or there are maybe two tools that roughly do the same thing and people have different perspectives on them. That’s when it gets really interesting. The ones where it’s kind of like, yes, everybody thinks that’s a good idea, those are less interesting. When we get into debates, that’s really interesting. But those are also really hard to blip.

So we have this concept of what we call too complex to blip, where we’re basically like, we’re never getting this in two paragraphs, this whole discussion, so we’re going to have to put out an article or a podcast or something like that. So those basically go into our thought leadership backlog of things that we might write about.

So then you might see them on MartinFowler.com, you might see them on the Thoughtworks podcast, you might see a longer form article on our website that gets into the nitty-gritty of the pros and cons and the nuances. Especially techniques, but sometimes even tools and languages and frameworks, can be hotly debated, which to me is the really interesting part, because as a leader, especially in technology, there’s no one tool to rule them all. There’s never one true answer. It always is “it depends”. And those conversations and discussions give me, as a technology leader, a deeper understanding of where those “it depends” cases lie, which gives me better tools and insights for sharing with our clients and helping them think about it as well.

Shane Hastie: What’s been the most surprising thing that’s come out of that Radar for you?

Rachel Laycock: I think the surprising thing that came out of the Radar is the amount of books and key thought leadership that set the tone in the industry. And I’ll use microservices as an example. I remember being in the room when that was being discussed, and like any new thing, it was very hotly debated. Some people were like, that doesn’t seem like a good idea, here are all the problems associated with it, because there are lots of challenges. We’re talking about a very complex architecture that requires a lot of skills in the teams to be able to build in that way and run software in that way. And so it was the kind of thing that was hotly debated. And then it started off as an article on martinfowler.com and eventually became the book. And I myself did plenty of talks on the conference circuit about the pros and cons of microservices and when you should do it and when you shouldn’t.

It was a big side effect that I don’t think anyone planned. It wasn’t like we went into that and were like, we’re going to get together every six months and produce this Radar, and then just assume that books and other great things are going to come off the back of it. It was much more organic. First of all, the Radar was only supposed to be an internal publication, but when we started sharing the insights with clients, they were like, “Oh, that’s really helpful”. So then we started publishing externally, and then the books followed from that.

Data mesh is another example. I remember that also being discussed in the Radar conversation, another technique, another approach. Again, very hotly debated internally. It wasn’t just thousands of Thoughtworkers saying, yes, this is a great idea. It was like, let’s see some use cases and see how it plays out. And then it eventually became kind of the canonical book. So it’s been exciting to be part of that journey, but it’s surprising. You wouldn’t have expected it.

Shane Hastie: So this is the engineering culture podcast. What is it about the culture of the organization and that group that enables this to happen?

Culture and Organizational Dynamics [07:23]

Rachel Laycock: Well, on Thoughtworks itself, I recently published an article, because obviously everybody’s talking about AI and software right now and how productive we can be, and I pointed out that I don’t think we’ve ever hired anyone at Thoughtworks just because of how fast they can code, because it’s never been just about coding. It’s always been around attitude, aptitude, integrity. Those were the three, I guess, values that we hired for. But there’s also a constant curiosity to Thoughtworkers. If you look at our sensible defaults: continuous improvement, continuous learning, curiosity, these types of things. There are a lot of, I guess, statements and things we say at Thoughtworks, like strong opinions, loosely held. And so when you then bring together leaders, especially if they’ve grown up at Thoughtworks, you come into the room and people are not afraid to say what they think, and they’re also happy to be told they’re wrong.

And I’ve heard from people that have come from other organizations who are not used to that at all, where you have to be careful what you say in front of which people, and if you say the wrong thing, it could be a career limiting move. At Thoughtworks, it’s really not like that, for better or worse. Very challenging as a leader when everybody’s always challenging you and asking you, “But did you think about this? And have you asked the right question?” But you bring that into the room when we’re discussing technology, and you end up with really thoughtful perspectives that have taken into account different opinions, and people change their minds throughout the process of that week. Maybe not always, and maybe not on everything, but I do find that quite unique to Thoughtworks. I will say that a lot of clients really like the Radar, and we’ve found that helping them build their own radar for their own organization has been super helpful.

And then as they progress down the path and they’ve done it a few times, they’ll be like, “Well, how do you handle this and how do you handle this?” And I’m like, oh, those are all the challenging exceptions of just dealing with people in a room with lots of different opinions. But this is a great thing. It means you’ve created the culture of people being able to express their opinion and hear the different voices in the room and come to a reasonable conclusion. So I just think that without Thoughtworks kind of culture, I don’t think we would have the Radar.

Shane Hastie: If I want to, and you’ve given us some pointers there, but if I wanted to instill something like that culture in an organization, where do I start?

You Can’t Copy Culture – You Can Encourage It [09:46]

Rachel Laycock: That’s really challenging, because it’s hard to change the culture of an existing organization. They say if it’s easy to change, it was never part of the culture in the first place. So I think the thing about Thoughtworks is its culture is what I said around attitude, aptitude, integrity, curiosity, continuous improvement, and the fact that the culture was also built around agile, right from the start, has created that kind of culture. But that’s not to say you can’t introduce some of those into a different organization.

So whenever I’ve been helping clients go on the agile journey, the continuous delivery journey, the microservices journey, the digital transformation journey, all the different journeys, because we’re constantly renaming things, it’s often the same kinds of concepts that we bring to the fore. And I always say to clients, you won’t be Thoughtworks at the end of this. And that’s not the intent, right? The intent is that you take some of the best things about us that fit in with your culture, and you transform your culture. If you don’t move from an old mindset of build, deploy, run, move on, to one of continuous improvement and continuously evolving software, you won’t get there; there are certain aspects of the culture that have to change in order to get into that continuous improvement, continuous evolution mindset. And you can bring those to the fore.

And some of the ceremonies help, although I’m not a fan of certifications that are built around just ceremonies, but they have to have intent. The reason why you do a stand-up every morning is so that you can quickly adjust if people are heading in the wrong direction and everyone has a shared context. The reason why you do retrospectives is so that you actually improve how the team is working and the ways of working for the team. If you just do those things, but you’re not clear on the intent, then you don’t get the value. And so I think when you start to introduce these types of ceremonies that are a part of XP, that are part of agile, that are part of what people have been doing with digital transformation with clear intent, then you can start to bring some of that culture along.

And then of course, another critical piece is the recruiting, as I said, attitude, aptitude, integrity has always been our thing. It was never about, we must hire from these universities and people have to have these things. It was always about who they are and what they brought to the table and what their approach was. And if they were essentially up for continuously learning and adapting, and most of the time we got that right, nobody gets recruiting right a hundred percent of the time, but most of the time we got that right and we were able to continue to grow the culture that we wanted.

Shane Hastie: Shifting tack a tiny bit, and you did touch on it when we were talking about the Radar, the efficiency focus that seems to be so prevalent today with generative AI. We’re going to bring in the co-pilots, we’re going to generate huge amounts of code and we’re going to be so much more efficient. I don’t see that really happening, do you?

AI Efficiency Hype and Reality [12:41]

Rachel Laycock: No, I don’t, to be honest, to be totally blunt. And even when we are more efficient, people will build that in by default and will no longer be more efficient. I’ll tell you what I mean by that. So let’s say you measure efficiency in your organization through velocity, or how long it takes you to do so many story points. Well, you can almost do a kind of before-and-after comparison at this point in time, but once people start adopting those tools, they’re going to estimate those story points and that velocity based on the tools that they’re using.

What I’ve noticed is this is not like the agile movement, which came from the development teams, driven by engineers recognizing that the waterfall approach was not helping us in many cases in terms of building software. And so if we take XP practices like pair programming, test-driven development, continuous integration, these kinds of things, and then some of the things I talked about earlier, like stand-ups and retrospectives, that’s going to help us move fast as well as get high quality, resilient features out the door.

But it was driven by engineers and by software development teams, not just engineers, also project managers and other folks that were part of the development teams. This focus on efficiency comes from the top down. And most of the technology leaders that I speak to are like, “My board’s putting pressure on me to measure efficiency and then tell me how much faster I am”. And I’ve been hearing this for a year and a half now, where they’re coming to me and saying, “What’s your efficiency metric? How are you measuring it?” And it’s notoriously hard to measure, by the way, for more reasons than I can even name here. But it’s also the wrong focus, because the issue of building high quality products at speed, at scale, that are resilient in production has never been how much faster can I write code. That’s never been the problem.

The problem is often the legacy systems that are hampering their ability to move forward, or it could be some processes and ways of working that are hampering their ability to move forward. Alongside the legacy, they often don’t have the right continuous integration and continuous delivery and deployment pipelines in place, and they don’t have the right testing in place. And these are problems that I’ve seen time and time again in organizations, and they are the real barriers to them moving effectively and achieving results effectively. And honestly, at the end of the day, these tools, whether it’s code generation or coding assistants, amplify indiscriminately. So you can write code faster, but it doesn’t mean that it’s high quality code, not if you don’t have the right guardrails in place. And so you could actually create more problems, right?

It’s like, okay, now I can write twice as much code. And it’s like, cool. Now you’ve got twice as much technical debt as you had before. And what was your biggest problem before in terms of being able to move quickly? Oh, it was technical debt. It wasn’t actually writing features faster.

And so I’m hopeful that as an industry, we’ll kind of move away from board-driven development as I’ve started calling it and back into, okay, let’s get these tools into the hands of the engineers and into the people that are part of the product software development life cycle. And then let’s see what great things they can do with them to solve some of the really intractable problems in software around technical debt, around legacy modernization, around running existing systems, around making systems more resilient instead of the hyper focus on let’s just build more and build it faster.

But I have a strong opinion on that. I’m happy for it to be weakly held and somebody to prove me wrong, but it hasn’t happened yet and it’s been two years.

Shane Hastie: If we do get these tools in the hands of the right people for the right reasons, what are some of the potential outcomes that you can see happening there?

Practical AI Applications [16:25]

Rachel Laycock: Well, one of the things I saw really early on, when we gave these tools to some of our developers who were dealing with some really challenging problems in modernizing legacy systems, is the ability to use these tools alongside other techniques to do code comprehension. So to understand code bases that you can otherwise only understand with an SME. And I’m looking at things like mainframes and COBOL, but those aren’t the only ones. There are plenty of other code bases written in all kinds of languages that very few people in an organization really understand or really have context of. They were written in a time when maybe there wasn’t great documentation, there wasn’t much testing. They require that SME. And we saw people immediately starting to see results of just being able to comprehend and interrogate what a code base was doing. And I did a video on this at YOW! last year, so you can find that and Google it, but I talk about the techniques that we used to do that. So that was one.

There’s another organization that we just started partnering with called Mechanical Orchard, which was founded by the people that founded Pivotal Labs, again, big proponents of XP practices. And they’ve started to use generative AI not only to understand existing code bases, but to actually transform them from old style code bases into new style code bases. And I’m not talking about just moving it from COBOL to Java, where it still looks like COBOL and it’s affectionately called JOBOL, but really being able to build out the test harness, then transform the code, and then check that it’s performing at the other end. And so there’s some really interesting stuff going on there as well. And then I think what’s also an important factor is when you get these tools in the hands of really experienced developers, they can test the edges of what these things can do really well and where the gaps still are.

AI Coding Mirroring the Microservices Intent [18:16]

And I’ll use the example of when we first were introducing the concept of microservices. One of the early ideas was these very modular, small pieces: if you build it in such a way that it’s small enough that you can easily comprehend it, then when you want to make changes to it, you don’t change it, you just rebuild it. I don’t think anyone ever really did that, because there’s still effort involved in doing it. But let’s say you did have a really nicely architected modular system and you’ve built great test harnesses, and that’s all in a pipeline. Well, maybe with generative AI you could rebuild small modular components quite easily. So that’s where I think it starts to get interesting: what can we do with a code base based on its current state? If it’s well architected and very modular, you could probably do different things with it versus a legacy code base. How can we take a legacy code base and turn it into something that’s well architected and modular?

But I think what will be really important is how we specify the software and how we verify it. And so organizations that have gone to the effort of having really strong guardrails and really good verification in their systems, with continuous deployment and continuous delivery, are going to be able to do more interesting things with these tools earlier than those that are not in that state. And so I’ve not predicted exactly where it’s going, but I think those are the things that we’re exploring. And I think those are the things that start to get interesting when you put it in the hands of the people who are really solving problems day in and day out in software.

Shane Hastie: Will I one day be able to take my monolith and drop it into a funnel and out of the end comes full microservices?

The Monolith to Microservices Question [19:58]

Rachel Laycock: One day, maybe, but not in the near term. In the near term, these tools can help you do that, but they’re not going to do it for you. It’s not “insert code base here, out pops a well-architected modular architecture”. It’s still going to take a lot of humans in the loop along the way. And that’s probably a good thing, ’cause I think the hype around the end of the software engineer is greatly overestimated right now. But I do think the role will change, and the kinds of things that you do day in and day out could change based on the tools, but that’s always been true. Once we started using IntelliJ and IDEs, we typed a lot less, and we’re probably going to be typing even less. But the understanding of the architecture of the system, and getting that right for how you want the thing to run in production, still requires real depth of experience. And I don’t think that’s going anywhere anytime soon.

All the POCs and all the hype I see are talking about, “Oh, look, I can build this app in five minutes and it used to take me days”. And I’m like, yes, but it’s a single app. That’s not what most of us are doing. When we’re building scaled enterprise software, we’re not just building one random app. That’s really not the problem we have to solve. It’s fun and it’s cool, a great side project, and I’m all for vibe coding, building my own little apps at home, but in production, guardrails are required.

Shane Hastie: So one of the things that sits in my mind is these tools are really good for experienced engineers. How do we build that experience?

Building Developer Experience [21:34]

Rachel Laycock: That’s a great question. It’s going to get a lot more challenging. Right now, the way we build that experience is we have what we call leverage across a product team. So you have experienced people, you have some mid-level experienced people, and then you’ll have some fresh out of university or one to three years in their role, and mixing them together, you get that kind of mentoring situation where they learn from each other. And then obviously most of us learned from the first time we put something in production and it went wrong; it’s usually the hard lesson that really teaches you about the importance of good testing and feature toggles and all of this good stuff. And we do a lot of that. At least since I’ve been in the industry, we’ve been using IDEs more or less. And so yes, a lot of it is auto complete, but most of the debugging and everything you had to figure out yourself.

Now, if you introduce tools that are helping you do the debugging or helping you fix the tickets, to me, that’s where a lot of the learning often is: when things go wrong. If it always goes right, then you only ever learn the happy path. And that is the thing that’s puzzling to me: if we get to a stage where we’re able to build more software, and rebuild more software, faster and more effectively, then fewer people will be running bigger systems, because there’ll always be more software to build. Nobody’s run out of ideas for products, things that they want to build. But it brings up the question of, well, how do you grow really deep expertise in folks when they’re at higher levels of abstraction? So when something goes wrong, if the AI can’t help them, how are they going to figure it out?

And that’s kind of a puzzle to me. And I think there are various different tests we’re running in terms of different shapes and team sizes that you can leverage with different tools, but I don’t have an answer to what the shape of a team will look like and how we’ll grow experience in engineers in the future when it changes dramatically. And I’m sure the industry faced this problem when we moved into a layer of abstraction where we no longer had to worry so much about the performance of the machine, and a lot of that was taken care of. People were like, “Well, who’s going to care about that? And what if something goes wrong?”

And in the end, we do track it; we have the kinds of verification in place that tell us if the performance is not going well, and then we’ll go in there and debug stuff. And we do seem to figure it out. But yes, it’s an open question for me. I’m sure we’ll get to a place in the industry where we figure out how to create new career paths for people, but there are a lot of unknowns right now, and that in itself just generates a lot of risk and fear.

Shane Hastie: Risk and fear and the turbulence that we’ve seen in the industry. And as a result of that massive disengagement and all of those things, how do we shift as an industry? What are the things we can do to get better?

Addressing Workforce Concerns and Hype-cycle Effects [24:34]

Rachel Laycock: Yes, it’s a good point. I think the challenge with all this hype and all this noise around, oh, we won’t need software engineers anymore, the agent’s going to do everything, is that it does disengage the workforce. And in fact, what I predict is we’re still going to need engineers that really understand systems. It’s just that they’ll probably have coverage over more systems, because a lot more of it can be automated, which is great, but we still need them in those roles. I don’t see them going anywhere anytime soon. And I am worried that with all the hype, the technologists are getting disengaged. It’s hard to get people excited and say, “Hey, use this tool and see what we can do with it”, if they’re being told, “Use this tool and you won’t have a job in five years”.

So I guess I’m just hoping that as an industry, we get over this hype cycle and I’m starting to see signs of it in the news, but we’ll see, where the models settle down a little bit. The tools settle down a little bit, and then it’s more incremental, the change. And then we’ll start to know, okay, well, how do things change with these models and with these tools?

But certainly, inside Thoughtworks, what I’ve been saying is that it’s not me, the CTO, or the technology leadership that’s doing this to you; this is happening in the industry. And I recognize this is coming from the top down. It’s not you guys saying, “Hey, we found these cool tools”, which is kind of the Radar, going back to the earlier conversation. The Radar helps us identify what kinds of things we wanted to tell our clients to use, because our teams were like, “These tools are so much better. Can we get clients to use them?” But I’m trying to engage people with this: I believe that we’re still going to need deep technology professionals, and I want to help everybody at Thoughtworks learn to be what the new version of that is.

Now, I could be wrong. But so could the hundred, 200, 300 other people espousing different perspectives out in the world. No one knows exactly how things are going to turn out. But I do think it’s really important as technology leaders to try and figure out how to engage people and get them excited about this. Because if we don’t, it’s just going to keep coming from the outside in, from the board, from organizations that are incentivized to make a lot of noise about this. And I believe we’re going to need deep technologists, especially if they’re covering more systems in the future. And so I don’t want people to get disinterested in the industry, or look to go into another field, or decide not even to join. And that’s what worries me the most actually right now: people coming out of university are like, “Oh, well, there’s no point. Everybody’s saying that this won’t be a job in two years”. I just don’t buy that. And I think that’s really problematic.

Shane Hastie: How do I tell my granddaughter that there’s a good career in technology today?

Technology Careers Remain Viable Despite AI Advances [27:25]

Rachel Laycock: Well, my view is technology is not going anywhere. The machine right now can’t do complex design. It doesn’t know intent. It doesn’t know why the software was built the way it was. It can tell you what it’s doing, but it doesn’t know why. And so that’s where humans fit in. Good design still comes from humans. And at the end of the day, these large language models are just built off past knowledge. There’s still a lot of creativity that goes into software. And we have this constant thinking about software in terms of building bridges. We lean into these engineering metaphors, and I just wish they would die, actually, because software is just as much an art and a craft as it is a science, and it’s evolved so much because there’s so much creativity that humans can bring to it.

And so as I said earlier, will the day-to-day tasks of a software developer change? For sure, but is technology going anywhere? Is the machine going to build it and run it all for us? Not anytime soon, I believe. And so I do think that there’s still plenty of roles in the technology industry for all of us. But predicting what’s going to happen in 10 years, that’s much more challenging.

Shane Hastie: Rachel, really interesting conversation. Thank you so much for taking the time to talk to us today. If people want to continue the conversation, where do they find you?

Rachel Laycock: Oh, that’s easy. I’m on LinkedIn. You can just search for me and ping me and I’m happy to talk.

Shane Hastie: Thank you so much.

Rachel Laycock: All right. Nice to meet you, Shane. Thank you.

