AI Archives | TechWire Asia
https://techwireasia.com/podcast_categories/ai/
Where technology and business intersect

Practical uses of AI in the enterprise business: Singapore, Nov 2, 2023
https://techwireasia.com/podcast/business-events-singapore-ai-artificial-intelligence-in-enterprise/
Fri, 08 Sep 2023
We talk about the event where the business decision-makers of the APAC region can learn about AI on established platforms.

The post Practical uses of AI in the enterprise business: Singapore, Nov 2, 2023 appeared first on TechWire Asia.


Show Notes for Series 03 Episode 48
This podcast is produced in conjunction with BMC Software.

Getting hands-on with some of the most practical and impactful technology available today is what’s on offer from BMC Software’s “BMC Connect Singapore” event, November 2nd 2023.

Ahead of the event, we discuss with Brenton Smith and Gunal Kannan from BMC what’s on the agenda, who the event’s designed for, and what attendees might learn and experience.

The challenge facing many organisations in the APAC region isn’t understanding or realising the power of machine learning and artificial intelligence; rather, it’s how the technology can be put into effect to deliver the best business outcomes. It’s here that a company like BMC helps.

With use-case examples, keynote and inspirational speakers, workshops and demonstrations, join BMC, its customers and prospects, for a day exploring the possibilities that machine learning offers business operations.

Gunal Kannan is from the technical side of the tracks, while Brenton is more business-oriented. Together, in conversation with our host, Joe Green, we go through both the detail of the event (and others on the company’s ‘world tour’) and discuss many of the issues surrounding the emerging technology of AI in the workplace.

REGISTER HERE

Learn more about BMC Connect, Singapore, November 2nd, 2023:
https://bmcconnect.bmc.com/connectsingapore/

Worldwide events where you can pick the team’s brains:
https://www.bmc.com/events.html

Brenton Smith, VP Software, Asia Pacific and Japan’s LinkedIn profile is here:
https://sg.linkedin.com/in/smithbrenton

Gunal Kannan, Associate VP, Technology Strategy and Advocacy’s page is here:
https://sg.linkedin.com/in/gunalkanna

Joe Green is here:
https://uk.linkedin.com/in/josephedwardgreen

Getting the Edge on Data Center Efficiency
https://techwireasia.com/podcast/dell-hpc-ai-ml-poweredge-server-new-podcast/
Tue, 07 Feb 2023
In this episode we're looking at Dell's PowerEdge server range, the hardware designed for cloud, edge, and high-performance workloads like AI and machine learning.


Show Notes for Series 03 Episode 29
This podcast is produced in conjunction with Dell Technologies.

As the world digitizes at breakneck speed, companies and organizations are under pressure not only to produce products and services of high quality but also to do so efficiently and in ecologically sound ways.

The new Dell PowerEdge server range delivers the kind of compute built to accelerate digital transformation, but does so power-efficiently. Even the manufacturing and design of the range have been considered for their ecological impact, and the new PowerEdge portfolio offers an impressive ratio of processing power to power consumption.

On this episode of Tech Means Business podcast, we talk to Andrew Underwood, the Field CTO for Data Centre Solutions at Dell Technologies, Asia Pacific and Japan, about the ethos behind the range’s design, AI and ML workloads, and the balance today’s enterprises have to achieve between service quality and sustainability concerns.

Edge or Cloud deployment, test environments or full production, Dell PowerEdge is the server for a new generation of efficient and cost-effective data centers.

See the PowerEdge announcement:
https://www.dell.com/nxtgen-dell-powedge-server

Sign up for a day’s bespoke workshop tailored to your business and its carbon-footprint targets:
https://www.dell.com/customer-solution-center

Learn more about the PowerEdge range from here:
https://www.dell.com/shop-poweredge-server

Andrew Underwood is on LinkedIn here:
https://www.linkedin.com/in/andrewjamesunderwood

Joe Green can be found here:
https://www.linkedin.com/in/josephedwardgreen/

Decision Intelligence is machine learning for business
https://techwireasia.com/podcast/artificial-intelligence-business-decisions-practical-peak-ai-podcast/
Mon, 07 Nov 2022
Can artificial intelligence work faster in business settings? Decision Intelligence is said to be the answer: where ML meets BI.


Show Notes for Series 03 Episode 21

The next generation of AI in the workplace is decision intelligence: the practical application of machine learning in commercial settings. Applying AI in the workplace has, to date, been a slow process and one that involves a lot of resources. But that’s all about to change.

Peak AI is the company that’s driving decision intelligence in organizations, turning the typical 18-month wait between query and results into a matter of weeks or even days. Plus, there’s no need to hire more data scientists and invest in ranks of super-expensive graphics processors.

The Peak AI offering is cloud-based, and while no ML solution is plug-and-play and operable by anyone, Peak AI’s platform takes away the heavy lifting usually placed on data teams. In addition, it actively narrows the gap between the data science function (which may know little about business strategy) and pure-play business functions (which, in turn, know little about data analysis).

Peak AI’s Decision Intelligence Index is drawn from its most recent survey of thousands of businesses that have begun an AI journey. We discuss the Index, the state of ML in the enterprise, and the best ways to turn the “new oil” of data into insights that have a positive impact in the workplace.

Learn more about Peak AI here:
https://peak.ai/

Ira Dubinsky, Head of Go-To-Market Strategy at Peak AI is here:
https://www.linkedin.com/in/iradubinsky/

Your host, Joe Green lives online here:
https://www.linkedin.com/in/josephedwardgreen/

Revealing cyber threats with AI and Exabeam
https://techwireasia.com/podcast/exabeam-ml-ai-threat-detection-antimalware-cybersecurity-podcast-s03-e03/
Thu, 14 Apr 2022
Using advanced machine learning to address cyber security problems from a business's standpoint.


Show Notes for Series 03 Episode 03

This podcast is produced in association with Exabeam.

A vital part of any organization’s cybersecurity arsenal is a SIEM system (security information and event management). Typically, SIEMs base their activities around logfiles, which by definition represent historic data. But today’s latest generation SIEM platforms can act on information pretty much as those files are written, and foremost among these systems is Exabeam.

On this episode of the Tech Means Business podcast, we speak to Bob Reny (CTO & Principal Systems Engineer) and Gareth Cox (VP Sales for Asia Pacific & Japan) from Exabeam about the company’s methodology, and how the machine learning engine at the platform’s heart works and protects.

Exabeam’s focus is very much on directed, business-oriented outcomes, so it can be deployed on the basis of, for instance, “help us stop phishing attacks” or “flag up potential insider threats.” By using advanced, self-learning algorithms, companies can be alerted to any type of anomalous behaviour that indicates illicit activity, even on very complex networks.

And because the learning corpus for the platform is the network it’s installed on, it improves over time in ways specific to each deployment. Up and running in weeks rather than months, Exabeam represents cybersecurity’s new frontier, one in which hackers, too, are leveraging AI to penetrate networks and destroy priceless commercial IP.
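The show notes describe baseline-driven anomaly detection only in general terms. As a purely illustrative sketch (this is not Exabeam's actual method; the function, threshold, and login counts below are invented for the example), the core idea of flagging deviations from a learned baseline fits in a few lines:

```python
from statistics import mean, stdev

def is_anomalous(history, new_value, threshold=3.0):
    """Flag new_value if it sits more than `threshold` standard
    deviations away from the baseline learned from history."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        # A perfectly flat baseline: any change at all is anomalous.
        return new_value != mu
    return abs(new_value - mu) / sigma > threshold

# e.g. daily login counts for one account (made-up numbers)
logins = [12, 15, 11, 14, 13, 12, 16, 14]
print(is_anomalous(logins, 14))   # False: within the normal range
print(is_anomalous(logins, 90))   # True: flagged for review
```

A production SIEM learns far richer baselines (per user, per host, per time of day), but the deploy-and-improve dynamic described above is the same: the longer the model watches a given network, the better its baseline fits that deployment.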

Here are our guests’ LinkedIn profiles:

https://www.linkedin.com/in/bob-reny-b6386227/
https://www.linkedin.com/in/garethcox/

You can join the company-run CTF here:
https://www.exabeam.com/tp/exabeam-ctf/

Get a demo of Exabeam here:
https://www.exabeam.com/contact/get-a-demo/

Soon to add l33t h4kk0r to his bio, the podcast’s host Joe Green is here:
https://www.linkedin.com/in/josephedwardgreen/

Putting AI into IT Operations in 2021
https://techwireasia.com/podcast/ai-it-operations-artificial-intelligence-provisioning-development-network-compute-podcast-02e15/
Mon, 22 Feb 2021
In conversation with Jayanti Murty of Digitate about selling AI into IT teams.


Show Notes for Series 02 Episode 15

IT professionals have thick skins that deflect most marketing, especially when it comes to AI. After all, your phone apparently uses AI to frame the perfect picture (really?) and your TV knows and adapts to your taste in viewing thanks to “advanced machine learning” (nope).

So selling AI into IT ops must be an uphill struggle. But with the right message and — most importantly — demonstrable use-cases and proof, that’s just what Digitate does. Jayanti Murty talks to us about perceptions of AI, the verticals most ready to adopt, and how machine-speed iterative algorithms actually improve the daily lives of IT staff.

Digitate launched in 2015, and was built by software engineers and more PhDs than you’d ever expect to see in a single room at any one time. A platform that’s “made its bones”, then, but we still have questions!

Jayanti on LinkedIn can be found here:

https://www.linkedin.com/in/jayantivsnmurty/

Joe’s page is here:

https://www.linkedin.com/in/josephedwardgreen/

ARMing the low-power data center
https://techwireasia.com/podcast/arm-data-center-chips-x86-aws-silicon-podcast-s02-e11/
Wed, 02 Dec 2020
In conversation with Chris Bergey, VP of Infrastructure Line of Business at ARM Holdings, with whom Joe discusses the ARM future in data centers, routers, washing machines and maybe even a computer or two.


Show Notes for Series 02 Episode 11

Buy a computer chip with some serious “grunt” and you go for Intel, right? Or AMD, perhaps more likely these days. In either case, it’s x86 architecture. That’s what data centers and clouds run on, after all; these are the serious computers for grown-ups.

The little voice of dissent you hear, however, hails from Cambridge, UK: a little company called ARM, the subject of a recently announced multi-billion-dollar acquisition by NVIDIA. The voice is telling you that not only are its chip designs as performant as those power-hungry x86 chips, but they run cooler, too.

It’s (nearly) just a case of recompiling your applications and, like magic, your cloud bill falls by 30% overnight. Your power consumption and carbon footprint shrink, too, and maybe your machine-learning algorithms get a boost. All in all, you look like a better IT professional.

Even Apple’s in on the act, although it’s not passing the savings it makes from not having to buy Intel chips straight on to consumers. Nevertheless, Apple now makes its own ARM chips (or rather, it uses “Apple silicon”).

With a unique licensing model, anyone can make their own ARM processors or variants thereof; and in fact, as Chris Bergey of ARM says, the more the merrier! This is a guy who looks forward to reading his Twitter feed as delighted comments flow through his timeline.

There will soon be 200 billion ARM chips in the wild. What’s the fuss about? What will it mean to your business? Will every enterprise buy hundreds of Mac Minis? Joe and Chris enthuse together in this podcast.

Chris Bergey on LinkedIn:
https://www.linkedin.com/in/chrisbergey/

And Joe’s LinkedIn is here:
https://www.linkedin.com/in/josephedwardgreen/

 

Full transcript available.

Joe Green (host): Welcome to the Tech Means Business podcast. This is a series of conversations that I like to have with interesting people in the worlds of technology and also of business and hopefully where those two areas of industry come together.

This week, I’m absolutely delighted to be joined by Chris Bergey of ARM or ARM Holdings, I guess. They’ve been acquired for an enormous amount of money recently by Nvidia, or at least a decent portion of the company has. And so it’s a good time to sort of catch on to that wave of interest and maybe talk about all things ARM and, in this case, well, let’s see where it takes us.

Now, as you can probably tell by my dulcet tones, I’m an Englishman, and terribly proud of it. And I remember, back in the 80s, a little town called Cambridge, which is probably most famous for its university. But back then, it also became known for a few technology companies that were born there; ARM is one of them. And, of course, Sinclair was another, and Sinclair’s ZX81 was the first computer I had access to: interesting fact there. So Chris, welcome to the podcast. It’s a real pleasure having you on the Tech Means Business podcast.

Chris Bergey (guest): Joe, it’s my pleasure. It’s really exciting to be here today.

Joe Green (host): So Chris, ARM’s in the news at the moment for all sorts of good reasons, really. And I’m personally incredibly excited about the prospects of now and the future, scaling the global heights, I guess. You must be absolutely thrilled with the progress that ARM has made over the last few years?

Chris Bergey (guest): Joe, it’s been an amazing ride. And I’ve obviously not been there for a lot of it; we’ve actually just celebrated our 30th anniversary of being founded in what we like to refer to as a turkey barn! It was one of the early offices that the team worked out of. And it has been an amazing ride.

And I think one of my favorite statistics is that, over those 30 years, it took us almost 26 years for ARM processors to ship basically 100 billion devices, and we are going to hit our next 100 billion in four years. So accomplishing again in just four years what first took 26 gives you an idea of the trajectory and how pervasive ARM has become in many areas of electronics.

Joe Green (host): I think it’s very easy, isn’t it, to become almost blasé about the extent of technology like this? Just looking around the room now, I’m looking at an audio amp with a glowing display, which probably has an ARM processor in it to drive that display. Down the hallway, there’s a washing machine and the TV, and all these things have got ARM processors in them. Is that part of the marketing schtick? It’s almost like talking about Linux, isn’t it? “It’s everywhere, but you just don’t realize it”?

Chris Bergey (guest): Well, I mean, I think we sometimes do I guess refer to that, but I think it’s really just about the evolution of semiconductors and the world that we’re living in today, as you highlight that, everything has a microcontroller in it or something that scales a lot higher than a microcontroller when you think about the smartphone in your pocket or whatever.

And so ARM has really ridden those waves, right. And I think the two biggest waves that ARM has ridden are the things, or what we call IoT today: all of the devices to which we added microcontrollers or intelligence, and to which we’re now adding connectivity. I think I have a connected Instant Pot, and my washer and dryer and my sprinkler system actually have Wi-Fi in them, right, something I would never have thought of; I worked quite a bit in Wi-Fi as that boom was happening. And then also the smartphone wave.

And I think that’s really it: these computing waves that occur. It’s really those billions of devices that have helped mature the ARM architecture and make it as widespread as it is. And like you said, it’s hard to find a device that doesn’t have some amount of ARM technology in it today.

Joe Green (host): Absolutely. And I mentioned some of those hidden devices, or at least smaller devices, which are pretty much everywhere. Is that going to be a challenge, do you think, for ARM? How are you going to break out of that “we power small devices” mold? How will you make the transition from powering the little things to, for instance, powering data centers?

Chris Bergey (guest): Well, you’re absolutely right that those are not the areas ARM is most thought of for. But we actually started over ten years ago seriously making investments, with a desire to participate in those markets in a meaningful way.

One of the examples I would give is, over ten years ago, ARM started to work with some national agencies and really looked at what it would take for supercomputers to be built on ARM, and I’m sure most of your audience is familiar with the supercomputer space. But it’s, it’s gotten some extra noise these days with COVID testing and some of the modeling and things like that, that we need these supercomputers to do.

And so we’ve been on a long journey, a ten-year journey, to try to achieve that. And we’re actually very proud of this year’s world number one supercomputer. They keep a list of the Top 500 supercomputers, and number one this year, by a large margin, 2.8 times faster than number two, is based on ARM. That’s the Fugaku system in Japan, built there by RIKEN. So yes, it’s been something we aspired to do. It’s been a journey. We’ve had some ups and downs. But we believe that it is happening today, and we think it’s an exciting future going forward in data centers for ARM.

Joe Green (host): Perhaps people aren’t aware of the importance of data centers. I mean, to us, we just pick up our phone and tap on a screen, and stuff happens, in 95% of cases, of course. Actually, what’s happening is happening distantly, in these huge data centers. So obviously, it’s a wildly important market. Why is ARM technology so well suited to the data center environment, in your opinion? Obviously, the route-one answer is that it’s low power and therefore low heat. But is there more to it than that?

Chris Bergey (guest): Well, it’s funny you bring that up, because it’s very similar, actually, to this ten-year journey I was just talking about, when ARM aspired to get into infrastructure and into data centers.

Front and center was the pitch you just said: hey, we’re low power, it’s gonna be great, right? And, quite honestly, it wasn’t; it wasn’t met with a lot of excitement from the operators. And, of course, the cloud wasn’t what it is today, with the concentrations you talked about. The response really was: hey, we’re plugged into the wall, we’re not battery powered, what we really care about is performance, and we need to really run these workloads.

And so that was really what the ten-year journey I’ve been mentioning was about: going to attack those two things. We felt we needed to do a lot of work on the software side, to make sure the software ecosystem and the types of workloads the cloud providers care about would work well in those environments. And on the other side, of course, we needed a very competitive processor core. That was something we knew we could get to; a lot of it was just some dollars and some focus. So that’s really what we focused on.

And I think, if you look at our penetration, or some of our early success most recently, it’s the fact that we have closed the performance gap. People are seeing on cloud-native workloads, or cloud software, that basically: hey, we’re getting performance as good as, or similar to, what we would get from the leading alternative processors in the x86 world.

Ironically, once we hit that performance threshold, all of a sudden people are saying: hey, that power thing is cool, and something we really find valuable. Because, as you mentioned, these data centers are just amazing, multiple-football-field type of size, and I’m talking about English football, not American football. They’re just enormous, and one of the biggest constraints they actually have is how big a power station they can build to feed these beasts. And really what you do is start breaking that down: you maybe have 500 megawatts, or some enormous amount of power you’re providing, and of course that gets broken down into how much power you can deliver to each rack. Well, if that’s your fixed metric, and if, because of the power density of ARM, you can offer 3,000 virtual CPUs versus, let’s say, 1,000 virtual CPUs, that’s a big deal, because that’s obviously three times the compute, and maybe three times the revenue dollars. It really changes the economies of scale.
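The rack arithmetic Chris walks through can be sketched in a couple of lines. All figures here are invented for illustration (the 15 kW rack budget and per-vCPU wattages are assumptions, not vendor data); the point is simply that watts-per-vCPU converts directly into density under a fixed power budget:

```python
def vcpus_per_rack(rack_budget_watts, watts_per_vcpu):
    """How many virtual CPUs fit under a fixed per-rack power budget."""
    return int(rack_budget_watts // watts_per_vcpu)

RACK_BUDGET = 15_000  # watts available to one rack (assumed figure)

# Assumed per-vCPU power draws, chosen only to reproduce the 3x ratio
# mentioned in the conversation.
x86_vcpus = vcpus_per_rack(RACK_BUDGET, watts_per_vcpu=15.0)
arm_vcpus = vcpus_per_rack(RACK_BUDGET, watts_per_vcpu=5.0)

print(x86_vcpus, arm_vcpus, arm_vcpus / x86_vcpus)  # 1000 3000 3.0
```

Since the power station, not the floor space, is the binding constraint, every watt saved per vCPU multiplies straight through into sellable compute per rack.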

And so that’s one of the biggest things that is really getting exciting for us: we’ve closed the performance gap, and we’ve closed the software-ecosystem gap. And now people are able to take advantage of the innate power advantage that we have, to increase the density even further.

Joe Green (host): Chris, you’ll have to forgive me if this is a terribly ignorant question. Is there something to be said for current “fashionable” development methods, I’m thinking about microservices and containers? Is that type of development more suited to ARM than more monolithic x86 development environments? Or is my opinion just one that’s coming from ignorance and hearsay?

Chris Bergey (guest): No, no, you’ve absolutely hit it on the head. So if you look back over the last, say, 10 to 15 years, there’s been a push for what people call cloud-native software development. And it’s actually not just a set of languages; it’s also a methodology, using continuous integration, continuous development (CI/CD) tools. And the idea is that you’re creating software that is abstracted from the hardware, right? And of course, you can even go to function-as-a-service or software-as-a-service, as you’re highlighting, but even cloud-native really thinks about things like containerization.

And there is this abstraction from the instruction set, or at least there is this ability, for example, in a CI/CD environment, that when you check in your code and do a build every night, you can actually build it on ARM just as much as you build it on x86.

And so as we’ve made our investments, we became a first-class citizen in that world. So yes, with cloud-native software, it’s pretty quick for you to be able to compile on ARM just like you would have compiled on x86. Cloud-native software, we estimate, is about 50% of the workloads in the cloud today, and it’s clearly the dominant growth factor if you look at workloads going forward. So cloud-native software has been a big enabler for ARM, and is one of the things that is really helping us make some significant inroads.
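As a hedged illustration of the abstraction Chris describes (this helper and its name are invented for the example, not ARM or AWS tooling): high-level cloud-native code never touches the instruction set directly, so the same script runs unchanged on x86 and ARM build runners, and at most sees the machine name as a string to normalize:

```python
import platform

def normalized_arch(machine=None):
    """Map the various machine-name spellings reported by x86 and ARM
    systems onto two canonical labels. With no argument, inspects the
    host this code is running on."""
    machine = (machine or platform.machine()).lower()
    if machine in ("x86_64", "amd64"):
        return "x86_64"
    if machine in ("aarch64", "arm64"):
        return "arm64"
    return machine  # anything else passes through unchanged

# On an Intel/AMD box this prints "x86_64"; on an ARM instance, "arm64".
print(normalized_arch())
```

A nightly CI job can run a script like this, and the full build, on runners of both architectures from the same checked-in source, which is exactly the "build it on ARM as much as on x86" workflow described above.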

Joe Green (host): I’ve now got this awful image in my head of a very sweaty Steve Ballmer, back in the day, running up and down the stage screaming “developers, developers, developers”. Isn’t that the key to it, really? Are we going to be seeing a lot of rabble-rousing, a lot of running up and down the stage, on behalf of ARM? Is it that simple: just recompile, and applications run on these chips that don’t take as much power as x86 chips?

Chris Bergey (guest): Yes, I feel like we need to do that. But what’s really exciting is that we have some of the biggest industry heavyweights doing that for us. We have some of the industry leaders driving that story, and one of the more vocal that I can talk about is Amazon and AWS. Last year at re:Invent, their big conference, one of the keynote’s key messages was their commitment to ARM and how they believe their Graviton and Graviton2 offering is going to change the future of cloud computing. That is really based on our Neoverse IP. And what they were promising customers was basically a 40% advantage in price performance versus the fifth generation of x86 instances they had offered.
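One nuance worth making explicit: a 40% price-performance advantage is not a 40% bill cut. For a fixed workload, 40% more work per dollar reduces spend by 1 - 1/1.4, roughly 28.6%. A quick sketch (the 40% figure is from the conversation; the spend number is invented):

```python
def spend_for_same_work(current_spend, price_perf_gain):
    """Spend needed to buy the same amount of work after a fractional
    price-performance improvement (e.g. 0.40 for '40% better')."""
    return current_spend / (1 + price_perf_gain)

# A hypothetical $100,000/month cloud bill with a 40% price-performance gain:
print(round(spend_for_same_work(100_000, 0.40), 2))  # 71428.57
```

That distinction matters to the CFO conversation below: the headline 40% is a work-per-dollar figure, and the realized bill reduction for an unchanged workload is somewhat smaller.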

So they are out there promoting that as well. In China, we’ve got quite a bit of traction too, and there are other cloud players making inroads. But what’s exciting, and it’s actually exciting for me, is just logging on to my Twitter feed.

Honestly, every morning there’s the excitement that’s building around that ecosystem. I literally go on Twitter and it’s different companies saying: hey, I moved to Graviton2 and it only took us a week; we’re seeing 20% performance gains; I reduced my Amazon bill by 30%. It’s just unbelievable feedback, in real time.

And so it’s really a flywheel at this point in time. I think there was a perception that going to ARM was going to be difficult, and now you see this ecosystem of tech leaders saying: hey, I’ve done it, it’s not that hard, and I’m getting a great advantage.

It’s really starting to get great traction, and the whole ecosystem is taking a look at that. And, as you mentioned, cloud computing is an awesome power. I think, with a lot of us going through this work-from-home period, we are amazed at the way infrastructure has been able to adapt, to allow workloads to change so quickly. That’s the beauty of the cloud.

The analogy I give for cloud is that it’s like your credit card, right? It’s great that you have all this extra spending power, and you can get all this extra compute. But the challenge is, you get the bill at the end of the month, and it catches up with you.

And so for someone like Amazon to be able to make a 40% price-performance offering; man, for the CFOs, CEOs and CIOs, cloud spending is top of mind. And it’s portable: it’s something where they can go and get that discount without having to renegotiate, just off the top, and have it available worldwide. Amazon has over eight locations worldwide where they’re offering their ARM servers.

It is a game-changer. So I would say yes, we’re screaming and, yes, we’re jumping up and down. But the awesome thing is that this is ARM, and it’s about an ecosystem: we have so many more people jumping up and down and shouting for us. As I said, I feel like it’s Christmas morning every day when I get to look at my Twitter feed. Yeah, it’s great. It’s really fun.

Joe Green (host): At present, Amazon is clearly leading the field in terms of market share for cloud computing, and over the next few years, obviously, ARM is going to be developing into data centers. Do you think, and I’m just trying to think of the best way to put it, that ARM chips will coexist for quite a long time alongside x86 platforms? So it’ll almost be separate offerings, on two different types of chip, to enterprise users: on the one hand there’ll be, I don’t know, the ARM users; on the other, the “serious” cloud users. How do you see it going?

Chris Bergey (guest): Well, I can focus on what we’re going to be offering, right? And I think what we offer is choice. And I think that’s really why we’re going to do quite well: if you look at the trends, you see a lot of movement towards vertical integration, right.

And I think that means tightly coupled hardware and software, and you see more companies, just like Amazon, building their own silicon. Well, that’s something that fits ARM’s market model quite well.

And so I think that is something the partners appreciate about us: they can build the product they want.

One thing that people maybe don’t realize is that data centers today are built very differently from the way on-prem data centers were built. I mentioned the 500 megawatts of power, but it’s more than that. It’s moved away from something very monolithic, where you rack and stack 2P (two-processor) servers, floor to ceiling, row after row. That’s not how you build a cloud data center today; you actually have a lot of specialty hardware. The world used to be built around general purpose, and general purpose was great because, as big as the enterprise server market was, it really needed scale, and the right solution was a compromise: a compromise that allowed it to have the scale that it had.

When you go to cloud scale, you really start seeing the benefits of domain-specific compute. That’s where you see things like GPUs or ML accelerators coming into these cloud data centers. You have services like S3, where storage is disaggregated; you’ve got flash disaggregated. So you really end up with racks of specialty hardware or specialty processors, not just general-purpose processors.

And again, that really fits the ARM model, especially with some of the Moore’s Law scaling challenges we have going forward. That customization is going to be even more necessary to keep up with computing demands while keeping power consumption, costs and other things in check.

Joe Green (host): Now, of course, at this point, if I may, I think we should make some differentiations for our listeners. We’ve mentioned the company names AMD and Intel, which both produce a particular type of chip, largely the same in overall structure but very different between vendors. Perhaps you could explain quite what the difference is between the, air quotes, “traditional” x86 processor vendors and an organization like ARM?

Chris Bergey (guest): Sure. So ARM is, at its core, an IP development company. What that means is that we are the curators of the ARM architecture, the ARM instruction set, and we create cores, basically soft IP, that implement ARM processors.

But then we hand that to a semiconductor manufacturer; it could be people like NXP, Broadcom, Qualcomm, MediaTek, and so on down the list; and they put that core into their product.

So, in my example, if Broadcom is building a Wi-Fi router chip, they’ll get a core from ARM, put it in that router chip, and it will run the Linux code or whatever the router chip is doing. They’re able to leverage the ARM ecosystem, but you don’t buy the chip from ARM.

If you’re a router manufacturer, you buy it from Broadcom. So that’s very different from Intel or AMD, where Intel or AMD is a one-stop shop: basically, they provide you a chip and maybe even a reference [inaudible] for a motherboard.

ARM’s approach is much more of a collaboration through our ecosystem, which we think is quite powerful. Because, again, we’re able to cover so many more alternatives: if you want a big one, a fat one, a skinny one, a red one, a blue one, you can find a partner and get what you want.

Joe Green (host): And that’s really the power of the ARM ecosystem. Of course, the thing about the ARM licensing model is that any OEM, any manufacturer, can buy a license and make their own chips. Do you think that’s the way forward for these big operators?

Chris Bergey (guest): I don’t think there’s going to be one size that fits all. There is clearly value in tight coupling, in this move towards owning the stack, hardware to software, and for certain products there are companies at a certain scale that desire to go down that path of creating semiconductors.

One of the things with the Moore’s Law scaling challenge I mentioned is that not only are we not getting the transistor gains we used to, to reduce power or improve performance, but the costs are starting to go nonlinear. Or maybe they’ve always been nonlinear, but we’re getting to an even steeper part of the curve.

So I think there’s this natural balance: the cost to build semiconductors continues to increase, which means you’ve got to have a certain market size, and I think that’s where the traditional semiconductor players really fill that requirement. I don’t see that requirement going away.

I think it will just morph into other areas. So, as I mentioned, there is interest in customization, and it’s something that ARM supports, but it is an expensive endeavor; you need to have significant scale and expertise.

So I think that both models will exist; I don’t see it going one way or the other in a huge way.

Joe Green (host): …and there comes the music. And that, of course, means that we’re gonna have to leave it there. It only remains for me to say, Chris Bergey of ARM, Senior Vice President of Infrastructure Line of Business, thank you ever so much for joining us on the Tech Means Business podcast.

Chris Bergey (guest): Joe, it’s been my pleasure. I look forward to talking to you again soon.

Joe Green (host): And so I turn to you now, listeners. Thanks so much for joining me. We’ll be discussing more with other people from ARM over the next few months, I hope. I’ve got a few things lined up, so watch this space. Until then, and until the next episode of the Tech Means Business podcast, thanks for joining me, and I hope to hear from you soon. Bye.

The post ARMing the low-power data center appeared first on TechWire Asia.

Talking, listening and transforming: Conversational AI by Nuance
https://techwireasia.com/podcast/nuance-omnichannel-dragon-speech-ai-ml-conversational-podcast-02e10/
Wed, 11 Nov 2020

In this episode of the Tech Means Business podcast, we talk about conversational AI, the logical next step to the pioneering technology of voice recognition and speech-to-text.


Show Notes for Series 02 Episode 10

This podcast is produced in conjunction with Nuance Communications.
Nuance Communications is on a mission to put the human voice at the heart of every organization! Forget what you think you know about voice assistants and those hopelessly dumb ‘bots you run into when you call into some companies. Nuance is a company that pretty much invented voice recognition and voice-to-text back in the days when such tech was so new and amazing it was like magic to mere mortals!

Technology never stands still, and the field of voice tech is no exception. How can voice combine with proven AI techniques to help businesses in 2020 and beyond? The answer comes in the form of conversational AI, the closest thing to a truly proactive virtual chat you’ll get, useful in many different areas of the enterprise, from the help desk to the call center and further afield.

In this Tech Means Business podcast, we talk to Robert Schwarz of Nuance Communications (Managing Director, Enterprise & Mobile ANZ) about how voice technologies and smart, conversational AI have multiple uses in any business, like voiceprint recognition, voice-to-big-data, smart real-time help for any professional, and even prevention of large-scale fraud.

The host of the show, Joe Green, recounts his own use of Dragon Dictate and asks how Nuance Communications has transformed its technology and its platform’s functional areas, where voice assistance (and assistants) makes critical differences to customer experience, safety, and ease-of-use.

As ever, there was too much to talk about on the day, but the podcast will show you some of the possibilities conversational AI offers and will, we hope, inspire your own voice-centric journey.

– Voiceprint as part of 2FA or MFA (multi-factor authentication).
– Interactive voice recognition that can respond intelligently when calls go “off-script.”
– Omnichannel call center operations that help customers and agents alike.
– Mapping out conversations and strategies to drive customer satisfaction.
– Smart assistants that help specialists (like physicians or IT support staff) in real-time.

Nuance is pushing the boundaries of the technology it helped invent and is still driving the possibilities in multiple verticals. Find out who, what, where, and how.

Robert Schwarz on LinkedIn:
https://www.linkedin.com/in/robert-schwarz-4872185/

Your host’s one nod at social media:
https://www.linkedin.com/in/josephedwardgreen/
