1
00:00:00,860 --> 00:00:03,120
Hello everyone, and
thank you for attending
2
00:00:03,120 --> 00:00:06,280
today's webinar sponsored
by Ciena and Cisco.
3
00:00:06,720 --> 00:00:08,540
Before we begin, I want to
4
00:00:08,540 --> 00:00:10,410
cover a few housekeeping items.
5
00:00:10,560 --> 00:00:12,640
On the left-hand side
of your screen is the
6
00:00:12,640 --> 00:00:14,980
Q&A box. If you have
any questions during
7
00:00:14,980 --> 00:00:17,040
the webcast, you can
type your question into
8
00:00:17,040 --> 00:00:19,400
the Q&A box and submit
it to our speakers.
9
00:00:19,480 --> 00:00:21,740
All questions will be
saved, so if we don't
10
00:00:21,740 --> 00:00:24,080
get to answer you, we
may follow up via email.
11
00:00:24,280 --> 00:00:26,540
At the bottom of your
audience console are multiple
12
00:00:26,540 --> 00:00:29,180
application widgets you
can use. If you have
13
00:00:29,180 --> 00:00:31,320
any technical difficulties,
please click on the
14
00:00:31,320 --> 00:00:33,640
help widget. Here you can
find answers to common
15
00:00:33,640 --> 00:00:36,380
questions. A copy of
today's slide deck is
16
00:00:36,380 --> 00:00:39,120
available for download in
the resource list widget.
17
00:00:39,420 --> 00:00:41,600
Towards the end of
today's presentation,
18
00:00:41,600 --> 00:00:42,900
we'll ask for your feedback.
19
00:00:42,900 --> 00:00:45,140
A survey is already
open on your screen
20
00:00:45,140 --> 00:00:47,260
and will only take one
minute to complete.
21
00:00:47,360 --> 00:00:49,400
Your feedback is
extremely helpful.
22
00:00:49,400 --> 00:00:51,880
An on-demand version
of the webcast will
23
00:00:51,880 --> 00:00:53,900
be available about one
day after the event
24
00:00:53,900 --> 00:00:56,160
and can be accessed
using the same audience
25
00:00:56,160 --> 00:00:58,260
link that was sent
to you earlier today.
26
00:00:58,410 --> 00:01:01,300
I would now like to turn
the event over to our
27
00:01:01,300 --> 00:01:04,620
Omdia Senior Principal
Analyst, Optical Networks
28
00:01:04,620 --> 00:01:07,320
and Transport, Sterling
Perrin. Sterling?
29
00:01:08,590 --> 00:01:10,920
Thanks, Barbara. Hello.
Welcome, everybody,
30
00:01:10,920 --> 00:01:15,440
to my first webinar of the
year. Welcome to 2026. And
31
00:01:15,440 --> 00:01:19,280
our webinar on Automation
and AI Adoption for the
32
00:01:19,280 --> 00:01:24,860
Transport Network: a look at
our late 2025 Omdia survey
33
00:01:24,860 --> 00:01:29,000
results on the topic. I will
be the host and moderator
34
00:01:29,000 --> 00:01:32,440
for today, and I'll introduce
our speakers in a moment.
35
00:01:32,440 --> 00:01:38,060
Here's a flow of
what we'll be talking about
36
00:01:38,060 --> 00:01:40,820
over the next hour. Next
slide. I'll talk about the
37
00:01:40,820 --> 00:01:43,620
project itself; there's a
survey and other components to
38
00:01:43,620 --> 00:01:47,160
it, and then we'll get
into the meat of what the
39
00:01:47,160 --> 00:01:50,060
survey was about, which is
looking at use cases and
40
00:01:50,060 --> 00:01:52,880
benefits around automation
and AI, looking at artificial
41
00:01:52,880 --> 00:01:56,260
intelligence, specifically
some detail there, and
42
00:01:56,260 --> 00:01:59,060
then a first for us on the
transport network looking
43
00:01:59,060 --> 00:02:02,820
at agentic AI, so some
interesting data points there.
44
00:02:03,460 --> 00:02:06,360
At the end, we'll have
questions and answers. Barbara's
45
00:02:06,360 --> 00:02:11,440
explained the process for
Q&A. Just ask as we go,
46
00:02:11,440 --> 00:02:15,400
as they come up and we'll
gather them up and get to
47
00:02:15,400 --> 00:02:17,440
as many as we can at the
end. We hope to have about 10
48
00:02:17,440 --> 00:02:20,700
minutes or so at the end.
Always interesting to get
49
00:02:20,700 --> 00:02:23,820
your thoughts, comments, and
questions in our webinars.
50
00:02:24,120 --> 00:02:27,980
The project itself is
survey-based. It's Automation
51
00:02:27,980 --> 00:02:30,840
and AI for Transport.
It's actually the fourth
52
00:02:30,840 --> 00:02:34,660
year, so some of you may
have joined our past
53
00:02:34,660 --> 00:02:38,160
webinars on results. This
is from the November 2025
54
00:02:38,160 --> 00:02:41,860
survey we conducted. We had
80 respondents globally.
55
00:02:41,860 --> 00:02:45,240
I'll talk through the
demographics in a moment
56
00:02:45,440 --> 00:02:50,300
of who responded, but we
look at transport, you know,
57
00:02:50,300 --> 00:02:53,900
broadly defined: IP, optical,
and converged IP and optical.
58
00:02:54,170 --> 00:02:57,020
As the years have gone by,
we've moved more and
59
00:02:57,020 --> 00:03:00,540
more into AI as a component
of automation, or a complementary
60
00:03:00,540 --> 00:03:03,340
technology. And as I
mentioned, this year we're really
61
00:03:03,340 --> 00:03:07,580
getting into agentic AI
and how operators are starting
62
00:03:07,580 --> 00:03:13,000
to see agentic AI factoring
into their strategies.
63
00:03:13,700 --> 00:03:15,840
The other components will be a
64
00:03:15,840 --> 00:03:17,520
white paper based
on the results.
65
00:03:17,520 --> 00:03:19,700
Everybody on this webinar
will be able to get
66
00:03:19,700 --> 00:03:22,220
a copy of that white
paper. It's very detailed.
67
00:03:22,220 --> 00:03:24,980
Look at the results. I'm
writing that now. That'll
68
00:03:24,980 --> 00:03:28,880
be available in early
February. That'll go out. And
69
00:03:28,880 --> 00:03:31,200
then a couple of blogs
that highlight some of the
70
00:03:31,200 --> 00:03:33,740
results. Those will go on
light reading. Very happy to
71
00:03:33,740 --> 00:03:36,900
have both Ciena and Cisco
as our sponsors this year
72
00:03:36,900 --> 00:03:40,400
and to have participants
from each company helping
73
00:03:40,400 --> 00:03:44,060
me walk through and
interpret the results that we
74
00:03:44,060 --> 00:03:48,300
have. It's very new data,
really just before the holidays
75
00:03:48,300 --> 00:03:51,480
that this came in. The
demographics, as you can
76
00:03:51,480 --> 00:03:55,100
see here, we really like to
get a mix of what we define
77
00:03:55,100 --> 00:03:59,380
as Tier 1 and Tier 2 and
3 operators. So Tier 1,
78
00:03:59,380 --> 00:04:01,460
we define as a billion
dollars or more in annual
79
00:04:01,460 --> 00:04:04,620
revenue, and a little over
half the audience for this
80
00:04:04,620 --> 00:04:08,100
survey is classified as
Tier 1. So we'll show some
81
00:04:08,100 --> 00:04:10,920
splits in the data there.
And then another one, of
82
00:04:10,920 --> 00:04:14,180
course, is North American
operators versus other regions,
83
00:04:14,180 --> 00:04:16,480
a little over half North
America, which is typical
84
00:04:16,480 --> 00:04:18,620
in our surveys. And then
we've got representation
85
00:04:18,620 --> 00:04:22,440
from EMEA, Asia-Pac, and
Central and Latin America.
86
00:04:23,040 --> 00:04:26,320
There's more
demographics detail that'll be
87
00:04:26,320 --> 00:04:28,380
in the white paper, so
you can see it in full, but this
88
00:04:28,380 --> 00:04:31,140
kind of gives you a high-level
picture of who's
89
00:04:31,140 --> 00:04:34,360
responding. Let me introduce
our speakers here. Happy to
90
00:04:34,360 --> 00:04:37,260
have Marie Fiala, Director
of Portfolio Marketing
91
00:04:37,260 --> 00:04:40,560
with Ciena, representing
Ciena today. Hi, Marie. Welcome.
92
00:04:40,610 --> 00:04:45,640
Hi, hello, everyone. Good
to have you on. I look
93
00:04:45,640 --> 00:04:48,260
forward to your thoughts. And
then she is joined by Omar
94
00:04:48,260 --> 00:04:51,020
Sultan, Director of Product
Management, Automation
95
00:04:51,020 --> 00:04:54,420
and AI with Cisco. Hi,
Omar. Good morning to you.
96
00:04:54,980 --> 00:04:56,600
Good morning, and thanks to
97
00:04:56,600 --> 00:04:57,680
folks for joining us today.
98
00:04:58,660 --> 00:05:01,540
Yeah, looking forward
to it. Let me, moving
99
00:05:01,540 --> 00:05:05,420
into the content of the
survey, use cases and
100
00:05:05,420 --> 00:05:07,440
benefits, some of the
high-level stuff before we
101
00:05:07,440 --> 00:05:12,300
get into more in the
AI and agentic AI bits.
102
00:05:13,500 --> 00:05:15,180
I'll present a couple
here, and then we'll
103
00:05:15,180 --> 00:05:17,860
get into the discussion.
But this is a
104
00:05:17,860 --> 00:05:20,760
question we actually
also asked last year, so
105
00:05:20,760 --> 00:05:23,280
it's good to kind of
compare year to year.
106
00:05:24,360 --> 00:05:26,680
This is looking at the
state of automation
107
00:05:26,680 --> 00:05:29,780
in telecom. Our
audience is, we call
108
00:05:29,780 --> 00:05:31,500
them communication
service providers.
109
00:05:31,620 --> 00:05:33,560
Telcos is kind
of the older term
110
00:05:33,560 --> 00:05:35,820
used for that,
telecom operators.
111
00:05:36,000 --> 00:05:40,400
And we base this question
on five levels of autonomous
112
00:05:40,400 --> 00:05:43,380
networking, which is adapted
from the TM Forum, which
113
00:05:43,380 --> 00:05:46,000
itself adapted it from the
automotive industry. I think
114
00:05:46,000 --> 00:05:50,120
most are aware it's
not a perfect model, but
115
00:05:50,480 --> 00:05:53,660
people know it. So if you
ask a question around these
116
00:05:53,660 --> 00:05:57,420
five levels, people have a
general sense. So we ask
117
00:05:57,420 --> 00:06:00,880
it, and these are the results
from this year. If
118
00:06:00,880 --> 00:06:06,280
you look at the today picture,
70% of operators are in
119
00:06:06,280 --> 00:06:10,340
the partial automation phase
or below in the global
120
00:06:10,340 --> 00:06:13,470
results. So, currently,
the dark blue is
121
00:06:13,470 --> 00:06:16,680
where you are today
in your phase of automation.
122
00:06:16,680 --> 00:06:19,940
Level two is partial
automation. It's really
123
00:06:19,940 --> 00:06:22,460
the earliest stage of what
we would call true automation.
124
00:06:22,460 --> 00:06:25,380
This is where the introduction
of AI and machine
125
00:06:25,380 --> 00:06:30,200
learning starts to come into
the network. If you look
126
00:06:30,200 --> 00:06:35,860
out to 2028, the pinkish
shade here in our color
127
00:06:35,860 --> 00:06:40,830
scheme, 56 percent
expect to be conditionally
128
00:06:40,840 --> 00:06:44,580
autonomous or higher
by the end of 2028. Over
129
00:06:44,580 --> 00:06:47,260
the next three years, that
would be either level three,
130
00:06:47,260 --> 00:06:50,800
level four, or level five
in the TM Forum scheme.
131
00:06:51,540 --> 00:06:55,580
Level three is where the
largest group expects
132
00:06:55,580 --> 00:06:59,140
to be, at 26 percent. Highly
autonomous, I mean, I'm
133
00:06:59,140 --> 00:07:01,340
sorry, level four, over
the next three years is
134
00:07:01,340 --> 00:07:04,520
where the bulk of the
audience expects to be at
135
00:07:04,520 --> 00:07:07,180
least the largest
individual share. Highly
136
00:07:07,180 --> 00:07:11,420
autonomous is level four.
So fairly aggressive plans,
137
00:07:11,420 --> 00:07:13,660
actually very consistent
with last year's
138
00:07:13,660 --> 00:07:17,100
results in terms of that
progression. So operators
139
00:07:17,560 --> 00:07:20,920
are surely consistent and
have something in mind of
140
00:07:20,920 --> 00:07:24,280
where they want to go: level
four, not so much level
141
00:07:24,280 --> 00:07:27,140
five, which we can certainly
discuss as we go through.
142
00:07:27,640 --> 00:07:32,660
The other, as I mentioned
up front, we talk about
143
00:07:32,660 --> 00:07:34,960
Tier 1 versus Tier 2
and Tier 3. I'll just
144
00:07:34,960 --> 00:07:37,880
highlight this here. It
is fairly consistent. The
145
00:07:37,880 --> 00:07:42,300
Tier 1 operators are more
aggressive in moving to
146
00:07:42,300 --> 00:07:45,800
AI versus their Tier 2
and Tier 3 counterparts.
147
00:07:45,800 --> 00:07:48,360
Again, we saw the same
thing last year. So this
148
00:07:48,360 --> 00:07:51,070
is the same question. I
just broke it out by the
149
00:07:51,160 --> 00:07:54,460
operator size. Tier 1,
dark blue, teal is the
150
00:07:54,460 --> 00:07:56,700
Tier 2 and 3s. and then
the left is the current
151
00:07:56,700 --> 00:08:01,880
state, and the right is
the expected state three
152
00:08:01,880 --> 00:08:05,020
years from now. You can
see the Tier 2s and 3s are
153
00:08:05,020 --> 00:08:07,680
significantly less advanced
than Tier 1s today.
154
00:08:07,980 --> 00:08:10,340
Probably, you know, as
you might expect, these
155
00:08:10,340 --> 00:08:12,640
are smaller operators,
less sophisticated,
156
00:08:12,800 --> 00:08:15,260
but a pretty big gap of
where they are today.
157
00:08:15,540 --> 00:08:16,680
On the right,
158
00:08:16,960 --> 00:08:19,220
interesting, you know, Tier
1s are still expected to
159
00:08:19,220 --> 00:08:21,600
be more advanced three
years out from now, again as
160
00:08:21,600 --> 00:08:24,780
you would expect. But
the takeaway that
161
00:08:24,780 --> 00:08:27,060
I had, particularly on this
one, is that the Tier 2s
162
00:08:27,060 --> 00:08:30,010
and 3s have pretty
significant expectations for
163
00:08:30,010 --> 00:08:33,980
moving to automation over
the next three years. So as
164
00:08:33,980 --> 00:08:35,580
we go through the discussion
we can kind of keep
165
00:08:35,580 --> 00:08:40,080
that in mind and, you know,
tease out maybe how
166
00:08:40,080 --> 00:08:42,380
vendors like Cisco and Ciena
and others can help
167
00:08:42,380 --> 00:08:45,120
these smaller operators
get to where they clearly
168
00:08:45,120 --> 00:08:47,280
want to go, but
certainly don't have
169
00:08:47,280 --> 00:08:50,820
the resources currently
to do it. With that, let
170
00:08:50,820 --> 00:08:54,600
me jump into the
discussion on this one.
171
00:08:54,740 --> 00:08:59,040
Omar, I'll let you lead.
This one was on use cases,
172
00:08:59,040 --> 00:09:02,120
and just for background
for the audience: we
173
00:09:02,120 --> 00:09:04,780
have a set of 11 use cases,
you can see them here, and
174
00:09:05,110 --> 00:09:08,260
they're a standard set that we
came up with, that we used
175
00:09:08,260 --> 00:09:11,060
the same ones last year.
They're not perfect, but it
176
00:09:11,060 --> 00:09:14,900
gives a good, you know,
standardized metric across
177
00:09:14,900 --> 00:09:17,720
multiple questions where
we can ask about use cases.
178
00:09:17,800 --> 00:09:20,760
The today picture: network
performance monitoring
179
00:09:20,760 --> 00:09:24,640
and inventory are topping
the list of where automation
180
00:09:24,640 --> 00:09:28,480
is today. Looking out to
tomorrow, which means over the
181
00:09:28,480 --> 00:09:30,680
next three years there are
really strong expectations
182
00:09:30,680 --> 00:09:33,600
to adopt almost all of the
use cases, really all of
183
00:09:33,600 --> 00:09:38,500
them. But 80%, I highlight here,
expect to adopt 10 of
184
00:09:38,500 --> 00:09:41,520
the 11. The one that was
lagging a bit was network
185
00:09:41,520 --> 00:09:48,780
design and planning, but
still quite impressive. Omar,
186
00:09:49,160 --> 00:09:51,280
thoughts and comments on this
187
00:09:51,280 --> 00:09:53,460
one? I'll let you start it off.
188
00:09:54,200 --> 00:09:56,880
Yeah, so I think one of
the key things is there's
189
00:09:56,880 --> 00:09:58,740
nothing like really
esoteric here. This is all
190
00:09:58,740 --> 00:10:02,710
bread and butter
operations for a transport
191
00:10:02,860 --> 00:10:06,160
network, and you kind of
see the investment across
192
00:10:06,860 --> 00:10:14,060
all these use cases in
that pink-colored bar. And
193
00:10:14,060 --> 00:10:15,800
this is kind of what we
see, too. The initial use
194
00:10:15,800 --> 00:10:19,340
cases were like more
augmenting staff with AI
195
00:10:19,340 --> 00:10:22,440
to help them do kind of
the humdrum work, do it
196
00:10:22,440 --> 00:10:26,340
faster or more efficiently
or offload altogether.
197
00:10:26,440 --> 00:10:29,780
I think the other piece
is things like planning
198
00:10:29,780 --> 00:10:32,780
and design and even
testing management.
199
00:10:33,380 --> 00:10:33,980
you know,
200
00:10:34,300 --> 00:10:36,020
less traction
there. I think those
201
00:10:36,020 --> 00:10:37,080
are the kinds of
things where maybe
202
00:10:37,080 --> 00:10:38,640
people should be
spending their time,
203
00:10:38,760 --> 00:10:40,180
kind of the higher
order functions
204
00:10:40,180 --> 00:10:44,100
and, you know, kind
of the automation
205
00:10:44,620 --> 00:10:46,660
and implementation
actually frees them,
206
00:10:46,660 --> 00:10:48,220
allows them to do
those kinds of things.
207
00:10:49,980 --> 00:10:51,880
Yeah, and then one
point I'll make, so
208
00:10:51,880 --> 00:10:53,520
these are a set of
use cases. And so
209
00:10:53,520 --> 00:10:55,250
when you see there's
lack of adoption,
210
00:10:55,460 --> 00:10:57,140
it doesn't mean that operators
211
00:10:57,140 --> 00:10:59,280
aren't interested
in that use case.
212
00:10:59,420 --> 00:11:01,620
And this is, you know,
you understand it. I just
213
00:11:01,620 --> 00:11:03,040
want to make it clear for
the audience. It doesn't
214
00:11:03,040 --> 00:11:05,220
mean that the use case
isn't of interest. This is
215
00:11:05,220 --> 00:11:07,890
specifically trying to get
where automation, where
216
00:11:07,890 --> 00:11:10,740
they see the applicability
of automation. And
217
00:11:10,740 --> 00:11:13,500
then the question is, you
know, are there things
218
00:11:13,500 --> 00:11:17,720
that are missing or not in
applying AI to automation?
219
00:11:17,720 --> 00:11:21,000
But it's not any time
there's a low finding on
220
00:11:21,000 --> 00:11:22,280
this or any of the questions.
It doesn't mean the
221
00:11:22,280 --> 00:11:24,520
use case isn't important.
It's, you know, we're
222
00:11:24,520 --> 00:11:27,580
really drilling into
automation and AI a bit later.
223
00:11:27,680 --> 00:11:29,360
Marie, any comments
or thoughts on this
224
00:11:29,360 --> 00:11:31,080
one before we go to
the next data point?
225
00:11:31,890 --> 00:11:34,440
Well, the one thing
I wanted to point out
226
00:11:34,440 --> 00:11:36,960
here, it was kind of
surprising to me, because
227
00:11:37,200 --> 00:11:40,220
network deployment and
commissioning, that use case,
228
00:11:40,220 --> 00:11:45,020
was lower, at 21 percent. And
when it comes to automation,
229
00:11:45,020 --> 00:11:47,820
we, you know, we at Ciena
are actually seeing a
230
00:11:47,820 --> 00:11:50,500
real driver for automating
those processes because
231
00:11:50,500 --> 00:11:54,160
of this scaling that's
happening, especially for
232
00:11:54,160 --> 00:11:57,960
data center build-out and
data center connectivity,
233
00:11:58,160 --> 00:12:00,280
right? The high capacity
connectivity across
234
00:12:00,280 --> 00:12:04,100
data centers for
expanding AI training
235
00:12:04,100 --> 00:12:07,220
geographically. So we're
actually seeing, like,
236
00:12:07,220 --> 00:12:09,760
despite that being
21%, we're seeing a lot
237
00:12:09,760 --> 00:12:11,800
of interest in automating
those processes.
238
00:12:13,180 --> 00:12:15,760
So this is another
good point, Marie.
239
00:12:16,920 --> 00:12:21,780
The other thing about the
audience for this survey
240
00:12:21,780 --> 00:12:26,660
is it's very, very telco
heavy. Yeah, and not
241
00:12:26,660 --> 00:12:30,020
hyperscaler or neocloud,
really, much representation
242
00:12:30,020 --> 00:12:33,800
at all. So it's interesting.
You know, AI
243
00:12:33,800 --> 00:12:38,120
connectivity, DCI, certainly
telcos are in play in
244
00:12:38,120 --> 00:12:41,300
there to a degree, but as
you know, the bulk of it
245
00:12:41,300 --> 00:12:44,540
right now is being dominated
by the hyperscalers. So
246
00:12:45,000 --> 00:12:48,200
it's interesting that
maybe the hyperscalers,
247
00:12:48,200 --> 00:12:50,140
certainly the hyperscalers,
and maybe telcos that
248
00:12:50,140 --> 00:12:52,020
are serving that particular
application might
249
00:12:52,020 --> 00:12:55,500
be seeing something
that for the broader set
250
00:12:55,500 --> 00:12:58,800
of, you know, telco,
they're not there yet.
251
00:13:00,780 --> 00:13:03,920
So, you know,
whether they're wrong
252
00:13:03,920 --> 00:13:05,320
to be missing it,
I don't know. But
253
00:13:05,320 --> 00:13:06,920
interesting,
interesting observation.
254
00:13:08,320 --> 00:13:11,140
Yeah, anyway, go ahead, Omar.
255
00:13:11,320 --> 00:13:13,080
I mean, if you look
at the adoption data,
256
00:13:13,080 --> 00:13:15,060
right, there's still
a good chunk of
257
00:13:15,060 --> 00:13:17,540
respondents that are
still zero to one,
258
00:13:18,120 --> 00:13:20,480
zero to two in terms of
where they are, where
259
00:13:20,480 --> 00:13:22,540
they want to be. And
I think that's some
260
00:13:22,580 --> 00:13:24,800
of the stuff I'm
referring to is like this
261
00:13:24,800 --> 00:13:27,180
and just bread and
butter automation stuff
262
00:13:27,960 --> 00:13:30,840
that you can do today
that folks are still
263
00:13:30,840 --> 00:13:33,020
trying to just get
the basics in place.
264
00:13:34,400 --> 00:13:36,560
Very much bread and butter, I
265
00:13:36,560 --> 00:13:38,540
think is the way to look at it.
266
00:13:39,740 --> 00:13:42,780
And if you compare it to
how these, I don't have a
267
00:13:42,780 --> 00:13:45,440
slide for it, but if you look
a little bit of, you know,
268
00:13:45,440 --> 00:13:49,540
how they answered last
year, there is a bit of a
269
00:13:49,540 --> 00:13:52,940
movement there. So I think,
you know, some of them were
270
00:13:52,940 --> 00:13:56,280
certainly still in the
mix, but I think they're
271
00:13:56,280 --> 00:13:58,760
still, you know, kind of
finding their footing about
272
00:13:58,760 --> 00:14:03,400
where automation applies.
To a degree, maybe some of
273
00:14:03,400 --> 00:14:06,310
that is, you know, showing
through in the results.
274
00:14:07,360 --> 00:14:10,760
Let me move to the next
one. This one was on
275
00:14:10,760 --> 00:14:14,200
digital twin. Marie, I'll let
you kick this one off.
276
00:14:14,410 --> 00:14:18,560
So, same thing: we took
the 11 use cases that we
277
00:14:18,560 --> 00:14:21,800
sort of standardized and
asked about where digital twin
278
00:14:21,800 --> 00:14:24,220
is most applicable. And
of course you're going to
279
00:14:24,220 --> 00:14:26,520
get different results from
where, you know, applications
280
00:14:26,520 --> 00:14:29,340
are today, so it
shouldn't be surprising
281
00:14:29,340 --> 00:14:31,080
that the list is going to
look different from the last
282
00:14:31,080 --> 00:14:34,320
slide. This is digital twin
here: network optimization,
283
00:14:34,320 --> 00:14:36,320
traffic engineering
and network performance
284
00:14:36,320 --> 00:14:39,340
monitoring were the top ones
on the list, and we had an
285
00:14:39,340 --> 00:14:42,920
additional six, as you can see,
kind of going down the list,
286
00:14:42,920 --> 00:14:45,420
where they're ordered from,
you know, most applicability
287
00:14:45,420 --> 00:14:49,120
to least. You can see where
it resonated.
288
00:14:49,940 --> 00:14:54,820
And just
as a quick, you
289
00:14:54,820 --> 00:14:58,440
know, look: it didn't correlate
specifically with
290
00:14:58,440 --> 00:15:00,600
the overall transport use
case priorities on the
291
00:15:00,600 --> 00:15:03,180
last slide. So digital twin
seems to be applying to
292
00:15:03,180 --> 00:15:05,680
specific things. But the
only other comment I'll
293
00:15:05,680 --> 00:15:08,560
make here, before I let
Marie comment, is that 36
294
00:15:08,560 --> 00:15:12,660
percent not having
plans for digital twin for
295
00:15:12,660 --> 00:15:17,640
transport, I think that's
fairly high. We noted it
296
00:15:17,640 --> 00:15:20,700
last year also: some
challenges to digital twins
297
00:15:20,700 --> 00:15:23,060
specific to transport, so
I think that's showing
298
00:15:23,060 --> 00:15:27,280
up as well. But Marie,
I'll let you open it up
299
00:15:27,280 --> 00:15:30,880
first: the use cases and
how they resonate with
300
00:15:30,880 --> 00:15:34,120
what you see, and then also
maybe addressing the...
301
00:15:37,860 --> 00:15:38,700
Yeah,
302
00:15:39,920 --> 00:15:42,120
the one thing I wanted
to note, like, it is just
303
00:15:42,120 --> 00:15:44,680
a one-year horizon here.
The way the question
304
00:15:44,680 --> 00:15:47,380
was phrased here is over
the next year or so.
305
00:15:48,220 --> 00:15:51,600
So kind of in the very
short term, there's
306
00:15:51,600 --> 00:15:53,560
like one-third of
operators. You're right,
307
00:15:53,560 --> 00:15:56,400
that was, you know,
it's quite high that
308
00:15:56,400 --> 00:15:59,320
a third have no plans
for digital twin.
309
00:15:59,320 --> 00:16:02,320
But it really wasn't that
surprising to me, just that
310
00:16:02,320 --> 00:16:05,400
there is some confusion
around what digital twin
311
00:16:05,400 --> 00:16:08,610
is or isn't, over and
above, you know, simulation
312
00:16:08,840 --> 00:16:15,860
capabilities.
So I'll spend a bit of time
313
00:16:15,860 --> 00:16:18,760
just on the next slide
going over how
314
00:16:18,760 --> 00:16:22,640
it works in practice. But
with respect to the results
315
00:16:22,640 --> 00:16:26,060
here, I did see that
network design and planning
316
00:16:26,060 --> 00:16:31,180
was quite high on
the list, and in comparison,
317
00:16:31,180 --> 00:16:35,590
though, for automation, it
was low on the list. So what
318
00:16:35,590 --> 00:16:39,560
that really brought home
to me was that, you know,
319
00:16:39,560 --> 00:16:42,360
operators, they're looking
to use network digital twin
320
00:16:42,360 --> 00:16:45,860
for those planning use cases
to get better confidence
321
00:16:45,860 --> 00:16:48,740
in the results before they
can even hope to automate.
322
00:16:48,740 --> 00:16:51,630
So that was one
that really, that
323
00:16:51,920 --> 00:16:53,760
was a point that
really struck me.
324
00:16:56,040 --> 00:17:00,780
Yeah, good point on the
one-year horizon, too. I
325
00:17:00,780 --> 00:17:02,840
should have noted that. A
lot of times we ask about
326
00:17:02,840 --> 00:17:04,820
three years; this
time we really wanted to
327
00:17:04,820 --> 00:17:09,370
know: these are very, very
near-term plans. So that
328
00:17:09,370 --> 00:17:12,730
certainly would
impact the 36 percent.
329
00:17:12,730 --> 00:17:16,260
Yeah. I know you want
to kind of walk through on
330
00:17:16,260 --> 00:17:19,620
the next slide. Before
you do that, I'll let Omar
331
00:17:19,620 --> 00:17:21,840
comment. Although, Marie,
did you have anything else
332
00:17:21,840 --> 00:17:23,740
to say on the results? And
then I'll kick it over
333
00:17:23,740 --> 00:17:29,820
to Omar. Omar, any thoughts
on this one? So I do think
334
00:17:29,820 --> 00:17:33,280
digital twin is quite
overloaded in terms of
335
00:17:33,690 --> 00:17:37,420
multiple definitions, you
know. I think you can have
336
00:17:37,420 --> 00:17:40,960
digital twins with varying
levels of fidelity and how
337
00:17:40,960 --> 00:17:44,980
rich and complex an environment
they can model. I think,
338
00:17:44,980 --> 00:17:47,180
from our perspective, we do
see that a high-fidelity
339
00:17:47,180 --> 00:17:51,460
digital twin is critical
to an AI strategy. I think,
340
00:17:51,460 --> 00:17:54,170
for customers to have trust in
their systems, they're
341
00:17:54,170 --> 00:17:56,480
gonna want to be able to
test a proposed change
342
00:17:56,480 --> 00:17:59,380
and see how it behaves
before pushing to production,
343
00:17:59,380 --> 00:18:01,960
if they can really automate
that kind of tool chain.
344
00:18:01,960 --> 00:18:04,320
You know, they need
that kind of ability
345
00:18:04,320 --> 00:18:07,300
to test and trust
before they deploy,
346
00:18:07,520 --> 00:18:09,080
which is ultimately
going to drive
347
00:18:09,080 --> 00:18:11,560
the need for a
very high-fidelity
348
00:18:11,560 --> 00:18:13,780
digital twin to be
able to pull it off.
349
00:18:14,920 --> 00:18:19,440
Yeah, and I do think some
operators are still trying
350
00:18:19,440 --> 00:18:22,140
to understand, you know,
even still what digital twin
351
00:18:22,140 --> 00:18:27,480
is as well. Fairly new
concept, I think, for many.
352
00:18:28,180 --> 00:18:30,460
With that, why don't
I let, Marie, I'll
353
00:18:30,460 --> 00:18:34,000
let you walk through a
bit more detail here,
354
00:18:34,000 --> 00:18:35,280
and then we'll go
to the next section.
355
00:18:36,360 --> 00:18:40,140
Sure. And this really picks
up what Omar was talking
356
00:18:40,140 --> 00:18:43,440
about, about building
trust. So just wanted to
357
00:18:43,440 --> 00:18:45,280
spend a bit of time, you
know, so how does it really
358
00:18:45,280 --> 00:18:48,540
work? So a lot of people
in the audience here,
359
00:18:48,540 --> 00:18:51,080
you're familiar that a
digital twin is a virtual
360
00:18:51,080 --> 00:18:54,100
representation of the actual
details of the physical
361
00:18:54,100 --> 00:18:56,080
network, the live network,
everything from the
362
00:18:56,080 --> 00:18:58,780
network elements, configuration,
behavior, et cetera.
363
00:18:58,780 --> 00:19:01,640
And these are
complex, right? Like
364
00:19:01,640 --> 00:19:03,680
the physical transport network,
365
00:19:03,860 --> 00:19:07,320
it has multi-layer IP
optical infrastructure from
366
00:19:07,320 --> 00:19:09,940
different vendors, and
there's really a mix of legacy
367
00:19:09,940 --> 00:19:13,900
and modern devices. So some
only have TL1 interfaces,
368
00:19:14,030 --> 00:19:17,400
some are gNMI-enabled for
real-time streaming, et
369
00:19:17,400 --> 00:19:19,800
cetera. I have to take all
of that into account. So
370
00:19:19,800 --> 00:19:23,540
it's quite difficult to
construct and implement a
371
00:19:23,540 --> 00:19:27,100
network digital twin and
have it accurate. Now, what
372
00:19:27,100 --> 00:19:29,680
I'm showing here: it's
all about the good data
373
00:19:29,680 --> 00:19:32,760
that the network can provide
up to the control
374
00:19:32,760 --> 00:19:36,360
layer. And I'm showing
here, conceptually, two
375
00:19:36,360 --> 00:19:38,660
streams, but it could be
one single stream; it could
376
00:19:38,660 --> 00:19:41,880
actually be a common data
source that both the network
377
00:19:41,880 --> 00:19:44,710
digital twin and the
controller really consume
378
00:19:45,940 --> 00:19:50,640
the same stream. So, again,
just an abstract
379
00:19:51,690 --> 00:19:53,640
representation here.
But when you have an
380
00:19:53,640 --> 00:19:56,080
accurate twin, you can
execute all sorts of what
381
00:19:56,080 --> 00:19:59,960
-if scenarios to really
try out solutions risk-free
382
00:19:59,960 --> 00:20:03,420
before applying
them to the network.
383
00:20:04,400 --> 00:20:06,980
And so what really
makes it interesting
384
00:20:06,980 --> 00:20:09,220
is when you bring
AI into the picture.
385
00:20:09,400 --> 00:20:12,080
And Omar stated that
as well. Really,
386
00:20:12,080 --> 00:20:15,660
AI engines,
agentic AI frameworks,
387
00:20:16,100 --> 00:20:18,060
network digital twin
and controllers that can
388
00:20:18,060 --> 00:20:20,240
all work together to
bring about that optimal
389
00:20:20,640 --> 00:20:25,760
outcome. They can, you know,
use various
390
00:20:25,760 --> 00:20:29,680
standards-based interfaces,
very accepted interfaces, like
391
00:20:29,680 --> 00:20:33,100
when it comes to AI
agent-to-agent interfaces and
392
00:20:33,100 --> 00:20:37,660
whatnot. So AI-generated
recommendations, when you put
393
00:20:37,660 --> 00:20:40,330
that in the mix, they can
be really safely validated
394
00:20:40,330 --> 00:20:42,940
in a digital twin environment
before they're enacted
395
00:20:42,940 --> 00:20:46,920
on the network. So even if
network digital twins aren't
396
00:20:46,920 --> 00:20:50,900
in that one-year time horizon
of adoption, as operators
397
00:20:50,900 --> 00:20:53,880
do adopt AI strategies,
they're going to become
398
00:20:53,880 --> 00:20:57,380
more and more important to
really build that trust.
399
00:20:57,900 --> 00:21:01,360
Yeah, so that kind
of summarizes my
400
00:21:01,970 --> 00:21:03,580
coverage. Yeah,
great. Thank you.
401
00:21:03,900 --> 00:21:09,740
And trust throughout is
quite a theme, I think,
402
00:21:09,740 --> 00:21:13,080
in the results and AI
as well as you and Omar
403
00:21:13,080 --> 00:21:16,720
have just brought out. So
I think quite rightly,
404
00:21:16,720 --> 00:21:18,860
we focused a lot of
the survey this year on
405
00:21:18,860 --> 00:21:21,740
artificial intelligence.
So let's kind of dig into
406
00:21:22,060 --> 00:21:24,240
some of the details here.
This is not the first
407
00:21:24,240 --> 00:21:26,740
year. Last year, we
focused a fair amount on AI
408
00:21:26,740 --> 00:21:29,400
as well, but we certainly
built on it this year.
409
00:21:30,010 --> 00:21:31,780
And the first thing,
of course, you
410
00:21:31,780 --> 00:21:35,180
want to look at
for AI adoption for
411
00:21:35,180 --> 00:21:38,320
transport is what
stage are operators at?
412
00:21:38,480 --> 00:21:41,220
So that was the question
here for the global group.
413
00:21:41,280 --> 00:21:45,220
More than half of the
service providers surveyed
414
00:21:45,220 --> 00:21:49,240
are at that middle
bar, which you can see is quite
415
00:21:49,240 --> 00:21:53,040
further along than the
other ones, in
416
00:21:53,040 --> 00:21:55,860
what I would call
the early phases of
417
00:21:55,860 --> 00:21:58,920
production networks. So you're
seeing AI in production
418
00:21:58,920 --> 00:22:02,720
networks for transport for
almost half of them. A very
419
00:22:02,720 --> 00:22:05,920
small share, as you might
expect, in advanced phases,
420
00:22:05,920 --> 00:22:10,180
and still about 20 percent
not yet using AI. I'll
421
00:22:10,180 --> 00:22:12,420
do a bit of a drill-down
on the next slide. But Marie,
422
00:22:12,420 --> 00:22:16,400
maybe I'll let you comment on
these results in terms of
423
00:22:16,400 --> 00:22:20,040
as expected, not expected,
or any other thoughts on
424
00:22:20,040 --> 00:22:24,360
how this came in. Yeah, this
really does reflect
425
00:22:24,360 --> 00:22:26,860
what we're seeing in our
customer engagements.
426
00:22:26,860 --> 00:22:30,520
Operators are trying out AI,
focusing on specific pilot
427
00:22:30,520 --> 00:22:34,040
use cases, because it really
takes a fair amount of
428
00:22:34,040 --> 00:22:37,620
retooling of infrastructure.
And we'll get into that:
429
00:22:37,620 --> 00:22:40,740
there are privacy
concerns regarding cloud
430
00:22:40,740 --> 00:22:44,480
access to large language
models, etc. So it takes a
431
00:22:44,480 --> 00:22:47,340
fair amount of retooling, and
also retooling of processes,
432
00:22:47,340 --> 00:22:52,480
to incorporate AI. And
so it does take time. So I
433
00:22:52,480 --> 00:22:55,680
guess I wasn't surprised. It's
good that there is such a
434
00:22:55,680 --> 00:22:59,340
high number actually trying
it out for specific uses.
435
00:23:00,600 --> 00:23:04,260
Yeah, I mean, certainly
we had a little
436
00:23:04,260 --> 00:23:06,700
bit of discussion on the
hype. Certainly, I don't think
437
00:23:06,700 --> 00:23:09,260
anybody's going to
disagree that the telco audience
438
00:23:09,260 --> 00:23:13,340
is going to be more conservative
than the hyperscalers
439
00:23:13,340 --> 00:23:17,840
for many, many
reasons. So it always
440
00:23:17,840 --> 00:23:19,980
depends on what
perspective you're going in with,
441
00:23:19,980 --> 00:23:22,780
but I would say I
agree: a result like this
442
00:23:22,780 --> 00:23:27,320
is good news, I think,
considering it's telco.
443
00:23:27,320 --> 00:23:30,380
They're
moving ahead at, I think,
444
00:23:30,380 --> 00:23:36,910
probably a reasonable
pace. Let me pull
445
00:23:36,910 --> 00:23:41,100
the next one up here.
And this was
446
00:23:41,100 --> 00:23:43,620
another one looking at
the Tier 1s versus the Tier
447
00:23:43,620 --> 00:23:46,280
2s and 3s, and I
wanted to highlight it here
448
00:23:46,540 --> 00:23:49,460
because, just as with automation,
we're seeing different
449
00:23:49,460 --> 00:23:51,580
expectations when we look
at the Tier 1 operators
450
00:23:51,580 --> 00:23:55,360
versus the Tier 2s and 3s.
In fact, it's quite
451
00:23:55,360 --> 00:23:59,480
significant. On the left,
I circled it: you can see 32
452
00:23:59,480 --> 00:24:03,220
percent of the Tier 2s
and 3s not yet using AI
453
00:24:03,220 --> 00:24:06,780
for transport, versus just
10 percent for the
454
00:24:06,780 --> 00:24:10,740
Tier 1s. So the Tier 1s
are clearly driving not just
455
00:24:10,740 --> 00:24:13,080
automation but particularly
the AI adoption,
456
00:24:13,080 --> 00:24:17,480
versus the smaller
operators. Marie, I'll let
457
00:24:17,480 --> 00:24:20,180
you comment here, and then,
Omar, if you wanted
458
00:24:20,180 --> 00:24:22,420
to share any thoughts. But
one thing
459
00:24:22,420 --> 00:24:29,820
I would be interested in
is how vendors can help,
460
00:24:29,820 --> 00:24:33,280
if at all, specifically
the Tier 2s and Tier 3s,
461
00:24:33,280 --> 00:24:36,320
move a bit more
aggressively and more
462
00:24:36,320 --> 00:24:41,080
in line with the larger
operators. Yeah, so I'll pick
463
00:24:41,080 --> 00:24:43,580
up on that. In fact,
the Tier 2s and Tier 3s
464
00:24:43,580 --> 00:24:46,320
are sometimes
more agile than the Tier 1s.
465
00:24:46,320 --> 00:24:51,220
And one thing we are
seeing is that
466
00:24:51,220 --> 00:24:54,220
the smaller operators
have really adopted cloud
467
00:24:54,220 --> 00:24:57,860
infrastructure more quickly.
They're even using,
468
00:24:57,860 --> 00:25:01,040
for example,
network control solutions that
469
00:25:01,040 --> 00:25:03,260
are hosted in the
cloud, and that makes it
470
00:25:03,260 --> 00:25:07,050
easier for them to adopt
AI technologies, because AI
471
00:25:07,050 --> 00:25:11,220
applications are hosted in
the cloud. So, in fact, it's
472
00:25:11,320 --> 00:25:14,680
somewhat easier. Despite these
473
00:25:14,680 --> 00:25:17,860
results showing that they're
not at
474
00:25:17,860 --> 00:25:21,380
the AI adoption stage
of the Tier 1s,
475
00:25:21,380 --> 00:25:25,300
they will catch up quicker,
I think, because they're
476
00:25:25,300 --> 00:25:28,080
all familiar with the cloud.
They typically
477
00:25:28,370 --> 00:25:32,140
are, again, much more open
to already leveraging
478
00:25:32,140 --> 00:25:34,180
the cloud for all
sorts of applications.
479
00:25:34,200 --> 00:25:36,120
Yeah, yeah, good points.
480
00:25:36,260 --> 00:25:37,800
Omar, any thoughts here?
481
00:25:38,470 --> 00:25:40,760
Yeah, I'll just kind of
echo that, maybe say it more
482
00:25:40,760 --> 00:25:45,430
strongly. I think for Tier
2, Tier 3, they
483
00:25:45,430 --> 00:25:48,860
actually have the
opportunity to leapfrog and
484
00:25:48,860 --> 00:25:51,700
maybe go from, you know,
Level 1, Level 2,
485
00:25:51,700 --> 00:25:54,600
to Level 3, Level 4,
because they can, you know,
486
00:25:54,600 --> 00:25:56,600
they have simpler environments,
simpler operations. They
487
00:25:56,600 --> 00:25:59,580
can consume off-the-shelf
solutions better than
488
00:25:59,580 --> 00:26:02,280
Tier 1s, where
almost everything they do is
489
00:26:02,280 --> 00:26:04,960
bespoke. So there's a lot
more heavy lifting for
490
00:26:05,040 --> 00:26:08,720
those operators to move
through the autonomy levels.
491
00:26:09,800 --> 00:26:12,180
Yeah, interesting,
interesting perspective. Let
492
00:26:12,180 --> 00:26:16,180
me, we have a couple of
use cases we want to go
493
00:26:16,180 --> 00:26:19,380
through. I think both of
you guys have a use case.
494
00:26:19,380 --> 00:26:22,220
Before we get there, let
me, let's just kind of
495
00:26:22,220 --> 00:26:25,120
get into the use cases a
bit. Omar, I'll let you
496
00:26:25,120 --> 00:26:28,280
start this one. So we
asked which use cases are
497
00:26:28,280 --> 00:26:32,100
expected to benefit the
most from AI. So, again,
498
00:26:32,100 --> 00:26:34,240
this is the standard set
of use cases, and this
499
00:26:34,240 --> 00:26:37,920
is which ones where AI
might be most applicable.
500
00:26:38,580 --> 00:26:41,660
Strong preference for the
top three there, predictive
501
00:26:41,660 --> 00:26:44,180
analysis, network
performance monitoring, and
502
00:26:44,180 --> 00:26:47,500
network troubleshooting,
and root cause analysis.
503
00:26:47,500 --> 00:26:50,820
And if I look back,
comparing last year to
504
00:26:50,820 --> 00:26:53,080
this year, because we
asked the same question,
505
00:26:53,500 --> 00:26:58,900
there's a very consistent
showing of the top five
506
00:26:58,900 --> 00:27:01,840
listed there. Certainly
within
507
00:27:01,840 --> 00:27:03,580
the top five each year, the
order has moved a little
508
00:27:03,580 --> 00:27:07,100
bit, but, you know,
statistically maybe not at all.
509
00:27:07,100 --> 00:27:10,340
So operators know, they
seem to know where they
510
00:27:10,340 --> 00:27:12,840
want to apply AI, whether
they've done it or not.
511
00:27:14,230 --> 00:27:16,080
So that's encouraging.
It seems to be a
512
00:27:16,080 --> 00:27:18,380
strategy there. But Omar,
thoughts, comments on
513
00:27:18,380 --> 00:27:20,600
this, and then we'll
dig into these cases.
514
00:27:21,100 --> 00:27:23,180
Yeah, sure. I think
there's an interesting
515
00:27:23,180 --> 00:27:25,720
difference between the
first two and the third.
516
00:27:26,230 --> 00:27:28,680
So the first two is stuff
you can do today, right?
517
00:27:28,680 --> 00:27:30,960
I think there's lots of tool
sets out there for doing
518
00:27:30,960 --> 00:27:33,760
these kinds of things. I
think there is a comfort
519
00:27:33,760 --> 00:27:36,940
level with operators, with
the tools and technology,
520
00:27:37,680 --> 00:27:40,580
and AI is a very
linear way to have
521
00:27:40,580 --> 00:27:43,060
maybe better analytics,
better insights,
522
00:27:43,120 --> 00:27:44,820
being able to get
more proactive.
523
00:27:45,880 --> 00:27:48,860
Root cause analysis is kind
of the opposite, right?
524
00:27:48,860 --> 00:27:51,940
When I talk to our
customers, it's really the
525
00:27:51,940 --> 00:27:57,940
holy grail of AI use cases.
But I think it's also
526
00:27:57,940 --> 00:28:02,240
the most complex and maybe
the least mature of the
527
00:28:02,240 --> 00:28:03,740
things out there because
it requires a lot of
528
00:28:03,740 --> 00:28:06,620
knowledge that's specific to
that environment and those
529
00:28:06,620 --> 00:28:09,060
operations, those kinds
of things. But I think as
530
00:28:09,060 --> 00:28:11,900
we start to crack that,
it has the potential to
531
00:28:11,900 --> 00:28:13,700
have the greatest upside
in terms of productivity,
532
00:28:13,920 --> 00:28:15,580
customer experience,
those kinds of things.
533
00:28:16,300 --> 00:28:20,120
All right. Great. Let me
just move things along.
534
00:28:20,120 --> 00:28:23,420
I know each of you has a
use case, so let me tee
535
00:28:23,420 --> 00:28:26,120
those up for you. Omar,
I believe you're first.
536
00:28:26,400 --> 00:28:28,460
And just for the audio,
what we wanted to do is,
537
00:28:28,460 --> 00:28:30,580
you know, we have a bunch
of data, but where we can,
538
00:28:30,580 --> 00:28:33,580
we want to show things
are real. So we tried to
539
00:28:33,580 --> 00:28:36,320
insert some specific use
cases in there to, you know,
540
00:28:36,320 --> 00:28:39,980
kind of put reality to
these pie charts and bar
541
00:28:39,980 --> 00:28:42,820
charts. So, Omar, I'll
let you start with the one
542
00:28:42,820 --> 00:28:45,160
you picked, and then Marie,
we'll let you go next.
543
00:28:45,460 --> 00:28:48,560
Sure. So we've been
building an agentic platform
544
00:28:48,560 --> 00:28:51,680
for a couple of years
now, and Config Drift is
545
00:28:51,680 --> 00:28:54,380
one of the first agents
we are releasing shortly.
546
00:28:55,200 --> 00:28:58,540
And it's interesting in
that it's designed to check
547
00:28:58,540 --> 00:29:01,060
compliance with golden
configs for your devices.
548
00:29:01,260 --> 00:29:03,560
And typically the
solutions that are
549
00:29:03,560 --> 00:29:05,560
available today, including
ones that we sell,
550
00:29:05,700 --> 00:29:10,800
really require you having a
list of your golden configs
551
00:29:10,800 --> 00:29:14,020
and being able to do
comparisons with what's running
552
00:29:14,020 --> 00:29:16,560
on the device. This takes
a very AI approach, in
553
00:29:16,560 --> 00:29:20,800
that it uses a more
human-intuitive process to
554
00:29:21,090 --> 00:29:24,000
find variances. So if I
show you a picture of, like,
555
00:29:24,000 --> 00:29:26,400
three dogs and a cat on a
slide, your mind automatically
556
00:29:26,400 --> 00:29:29,480
is, oh, the cat's, you
know, the thing that doesn't
557
00:29:29,480 --> 00:29:31,800
belong here without any
other prompting from you.
558
00:29:31,820 --> 00:29:34,120
And essentially, this is
what Config Drift does.
559
00:29:34,120 --> 00:29:36,820
It'll go through all
your router configs or
560
00:29:36,820 --> 00:29:39,880
device configs. It'll
train a small onboard model
561
00:29:39,880 --> 00:29:42,420
on those so it starts
to understand, okay,
562
00:29:42,420 --> 00:29:47,040
what's normal, what's normal
to vary, what's normal
563
00:29:47,040 --> 00:29:48,980
to be the same. And
then it pulls out the
564
00:29:48,980 --> 00:29:51,400
things that are varying
that probably shouldn't.
565
00:29:51,540 --> 00:29:53,740
and it generates a report
for you. So the nice
566
00:29:53,740 --> 00:29:56,220
thing is there's no prep
work. There's no training.
567
00:29:56,410 --> 00:29:58,560
You can just kind of
run this out of the
568
00:29:58,560 --> 00:30:01,240
box to find issues in
your configurations.
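[Editorial aside: the "three dogs and a cat" intuition Omar describes can be sketched as a toy outlier check over device configs. This frequency-based sketch is a hypothetical stand-in for the small onboard model he mentions, not the actual Config Drift agent.]

```python
from collections import Counter

# Toy sketch of config-drift detection: lines that appear on almost every
# device are treated as "normal"; a line present on only a small share of
# devices is flagged as probable drift, with no golden-config list required.

def find_drift(configs, min_share=0.5):
    """configs: dict of device -> list of config lines."""
    counts = Counter(line for lines in configs.values() for line in set(lines))
    n = len(configs)
    report = {}
    for device, lines in configs.items():
        rare = [l for l in set(lines) if counts[l] / n < min_share]
        if rare:
            report[device] = sorted(rare)   # the lines that "don't belong"
    return report

configs = {
    "r1": ["ntp server 10.0.0.1", "snmp community ro"],
    "r2": ["ntp server 10.0.0.1", "snmp community ro"],
    "r3": ["ntp server 10.9.9.9", "snmp community ro"],  # the "cat"
}
print(find_drift(configs))  # only r3's odd NTP line is reported
```

As in the transcript: no prep work and no labeled golden configs, just the population of configs itself defining what "normal" looks like.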
569
00:30:02,360 --> 00:30:03,360
All right.
570
00:30:03,720 --> 00:30:07,040
Excellent. And Marie, the one
571
00:30:07,040 --> 00:30:08,420
that you wanted to highlight.
572
00:30:08,700 --> 00:30:12,060
Yeah, well, I just wanted
to set it up first and just
573
00:30:12,060 --> 00:30:16,220
talk about kind of the
ways that users consume AI.
574
00:30:16,400 --> 00:30:19,820
You know, there are two kinds
of personalities, as I call
575
00:30:19,820 --> 00:30:22,560
them here. One is that ad hoc
query that we're all familiar
576
00:30:22,560 --> 00:30:24,660
with because we have it
on our devices now, et
577
00:30:24,660 --> 00:30:28,020
cetera, where you ask a
question, you get a result. And
578
00:30:28,020 --> 00:30:31,000
then there's continuously
running AI-driven insights.
579
00:30:31,000 --> 00:30:34,240
And so these aren't mutually
exclusive by any means,
580
00:30:34,240 --> 00:30:37,360
but I just wanted to bring
it back to the results of the
581
00:30:37,360 --> 00:30:40,160
survey here, the top one,
predictive analytics and
582
00:30:40,160 --> 00:30:44,970
network health analysis, that
was the top use case selected
583
00:30:45,700 --> 00:30:48,680
by the survey respondents,
right? That
584
00:30:48,680 --> 00:30:52,120
lends itself more towards
those continuously running
585
00:30:52,120 --> 00:30:55,590
AI-driven insights, because
it's about prediction, right?
586
00:30:55,590 --> 00:31:00,320
Really useful for
planning teams. So it's
587
00:31:00,320 --> 00:31:03,040
just important to keep that
in mind and make sure that
588
00:31:03,040 --> 00:31:06,280
that the AI tools
that you have support both
589
00:31:06,420 --> 00:31:09,560
these types of
personalities, as I'll call
590
00:31:09,560 --> 00:31:12,760
them. Now, though, the use
case that we're seeing as most
591
00:31:12,760 --> 00:31:15,560
prevalent, albeit
predictive analytics was
592
00:31:15,560 --> 00:31:19,600
the top, we actually see
customers looking to AI for
593
00:31:19,600 --> 00:31:22,720
those troubleshooting
scenarios, right? Because
594
00:31:22,720 --> 00:31:26,060
that's
the highest priority.
595
00:31:26,060 --> 00:31:28,320
You know, if there's an
outage, if there are
596
00:31:28,320 --> 00:31:30,900
issues on their network,
they don't want
597
00:31:30,900 --> 00:31:35,480
their name
in the paper. So they're
598
00:31:35,480 --> 00:31:38,740
really looking for help
there. And what
599
00:31:38,740 --> 00:31:42,440
I'm showing here is what
I would call the
600
00:31:42,440 --> 00:31:45,360
reactive assurance or
network troubleshooting
601
00:31:45,360 --> 00:31:48,620
scenario, where if there's
something going on, you know,
602
00:31:48,620 --> 00:31:51,120
there's an alert,
there's some issue, and you
603
00:31:51,120 --> 00:31:54,120
can use that AI assistant,
as we call
604
00:31:54,120 --> 00:31:57,020
it, in terms of an ad hoc
query, or it can be alerted;
605
00:31:57,020 --> 00:31:59,860
it doesn't really have to
be initiated by the user
606
00:32:00,520 --> 00:32:01,400
automated.
607
00:32:01,840 --> 00:32:04,480
And that AI system really helps
608
00:32:04,480 --> 00:32:06,700
you diagnose the issue first,
609
00:32:07,300 --> 00:32:08,740
troubleshoot it, and then
610
00:32:08,740 --> 00:32:11,060
provides recommended
resolution.
611
00:32:11,460 --> 00:32:13,660
And this is all spelled
out. And again, back to
612
00:32:13,660 --> 00:32:16,660
building trust, we kind of
show a step-by-step sequence
613
00:32:16,660 --> 00:32:19,560
of what the issues are,
what you can do to resolve
614
00:32:19,560 --> 00:32:21,660
the issue. But all this
can be automated further
615
00:32:21,660 --> 00:32:25,800
down the road. Again,
it's just that we're in
616
00:32:25,800 --> 00:32:31,660
a time
frame where users need to
617
00:32:31,750 --> 00:32:34,560
get more and more familiar
with the capabilities,
618
00:32:34,560 --> 00:32:37,500
be able to trust the
AI engines, in
619
00:32:37,500 --> 00:32:40,240
order to then get to the
next step of automation,
620
00:32:40,240 --> 00:32:44,340
the autonomous levels
that we talked about.
621
00:32:46,680 --> 00:32:51,700
Yeah, good point.
You know, we think of
622
00:32:51,700 --> 00:32:54,320
things, and I think we asked
in this context, but what
623
00:32:54,320 --> 00:32:57,940
kind of bubbles up a
lot is automation for
624
00:32:57,940 --> 00:33:00,440
reliability and network
uptime, and kind of the flip
625
00:33:00,440 --> 00:33:03,820
side of that is avoiding your
name in the news. I think,
626
00:33:03,820 --> 00:33:06,960
to your point, that is
the negative side of
627
00:33:06,960 --> 00:33:10,380
that. And we're certainly
seeing that quite
628
00:33:10,380 --> 00:33:14,400
a bit. Automation
clearly is being eyed
629
00:33:14,400 --> 00:33:17,220
both on the benefit side
and to avoid the very
630
00:33:17,500 --> 00:33:21,220
strong negative
that could really do
631
00:33:21,220 --> 00:33:25,320
damage to a brand. Yeah,
and this one here
632
00:33:25,600 --> 00:33:30,380
hits on that point.
Marie, I'll let you
633
00:33:30,650 --> 00:33:33,540
make some observations
here. But, you know, the
634
00:33:33,540 --> 00:33:35,560
most significant benefits
of AI for transport:
635
00:33:35,560 --> 00:33:40,540
reducing
human error and faster
636
00:33:40,540 --> 00:33:43,280
troubleshooting. It's
exactly that, those two at the
637
00:33:43,280 --> 00:33:46,980
top. I don't know, maybe
we've already covered
638
00:33:46,980 --> 00:33:48,660
it, but any comments
you want to make here?
639
00:33:48,660 --> 00:33:50,920
Certainly consistent with
what you just described.
640
00:33:51,200 --> 00:33:54,660
Yeah, exactly. So
operators want
641
00:33:54,660 --> 00:33:58,000
to use AI to avoid
errors in the first place,
642
00:33:58,000 --> 00:34:01,360
but if they do happen,
to do that
643
00:34:01,360 --> 00:34:05,070
troubleshooting all the
quicker and smarter. And the
644
00:34:05,070 --> 00:34:08,380
only thing I wanted to
point out here is that,
645
00:34:08,380 --> 00:34:11,080
like, the OPEX savings
versus CapEx savings, like,
646
00:34:11,080 --> 00:34:13,780
we certainly do see an
emphasis on OPEX savings
647
00:34:13,780 --> 00:34:17,380
that really want to use,
you know, our customers
648
00:34:17,380 --> 00:34:21,400
are looking to use AI to
save operational costs.
649
00:34:22,540 --> 00:34:25,900
And that domino effect on
customer satisfaction, et
650
00:34:25,900 --> 00:34:28,080
cetera, it's not quite as
quantifiable right now.
651
00:34:29,880 --> 00:34:32,000
It's not that much of a focus.
652
00:34:32,260 --> 00:34:36,080
So you are seeing
today a strong focus
653
00:34:36,080 --> 00:34:38,520
on the OPEX savings
from AI? Was that?
654
00:34:38,900 --> 00:34:40,220
Absolutely. Yeah.
655
00:34:40,840 --> 00:34:44,660
Okay. Yeah, that's interesting
because I don't know
656
00:34:44,660 --> 00:34:46,600
if it was last year, but
a different survey we
657
00:34:46,600 --> 00:34:50,220
did. We actually did poll
operators on how much OPEX
658
00:34:50,220 --> 00:34:52,760
savings they're getting
from AI and automation.
659
00:34:52,760 --> 00:34:56,740
And it was surprisingly
low, at least currently.
660
00:34:57,240 --> 00:35:00,880
So it's good to
see that here. It's
661
00:35:00,880 --> 00:35:03,020
certainly a goal and
what you're seeing in
662
00:35:03,020 --> 00:35:06,620
your own
discussions and work with
663
00:35:06,620 --> 00:35:11,300
customers. But I
don't know if you...
664
00:35:12,220 --> 00:35:14,920
It's always a challenge
to quantify the savings.
665
00:35:14,920 --> 00:35:17,300
That's what we do see in
666
00:35:17,300 --> 00:35:21,400
real life. Yeah,
that's absolutely right.
667
00:35:21,460 --> 00:35:24,100
Let's hit the
challenges and then move to
668
00:35:24,100 --> 00:35:27,120
the last section on
agentic AI. Actually, we're
669
00:35:27,120 --> 00:35:29,820
doing pretty good on
time. So, good job, guys.
670
00:35:31,120 --> 00:35:33,300
Adoption challenges. Omar,
I'll let you kick this
671
00:35:33,300 --> 00:35:34,880
off. I mean, there's a
whole bunch of interesting
672
00:35:34,880 --> 00:35:37,700
stuff in here. The main
challenge, integrating with
673
00:35:37,700 --> 00:35:40,960
existing OSS/BSS management,
is the biggest one.
674
00:35:41,480 --> 00:35:44,820
Data privacy and compliance and
675
00:35:44,820 --> 00:35:47,960
maturity are also quite high.
676
00:35:48,860 --> 00:35:51,220
And they do vary. I
don't have a data point
677
00:35:51,220 --> 00:35:53,020
for it, but they did
vary quite a bit by
678
00:35:53,020 --> 00:35:56,460
size of operator, Tier
1s versus Tier 2, 3s.
679
00:35:56,460 --> 00:36:00,060
The Tier 2, 3s, lack of
in-house resources is
680
00:36:00,060 --> 00:36:03,180
their main barrier,
which could explain why
681
00:36:03,180 --> 00:36:06,260
they're not as far along
as maybe they could be.
682
00:36:06,620 --> 00:36:07,580
Omar,
683
00:36:08,100 --> 00:36:12,920
thoughts on these results,
and are there ones that
684
00:36:13,660 --> 00:36:16,960
aren't here but you're seeing
in your own interactions?
685
00:36:17,580 --> 00:36:20,160
Yeah, certainly
integration is the big one,
686
00:36:20,160 --> 00:36:21,720
and it's not just
integration of the tool
687
00:36:21,720 --> 00:36:25,040
sets that customers
already have, but also the
688
00:36:25,040 --> 00:36:27,200
MOPS, the processes,
those kinds of things.
689
00:36:27,320 --> 00:36:29,820
I think the
advantage of regular
690
00:36:29,820 --> 00:36:32,120
automation is just, you know,
691
00:36:32,740 --> 00:36:35,080
automating existing MOPS.
And a lot of times what
692
00:36:35,080 --> 00:36:37,380
we're doing with AI, as an
example, and we just used,
693
00:36:37,640 --> 00:36:40,140
you know, you're changing
how you go about doing
694
00:36:40,140 --> 00:36:44,100
things and customers need
to kind of figure out
695
00:36:44,100 --> 00:36:46,680
how they want to do that
in their own environment.
696
00:36:46,920 --> 00:36:50,420
Certainly data privacy and
where data goes is a huge
697
00:36:50,420 --> 00:36:54,320
concern. A lot of our
CSPs are very much on-prem
698
00:36:54,320 --> 00:36:58,720
operations, so how they handle
data and using things like
699
00:36:58,720 --> 00:37:01,790
public LLMs and stuff is a
big topic of conversation.
700
00:37:03,520 --> 00:37:07,300
I think the other piece
for them that maybe
701
00:37:07,300 --> 00:37:09,540
is not on here is things
like understanding
702
00:37:09,540 --> 00:37:12,530
when AI is mature enough
to put into production.
703
00:37:13,220 --> 00:37:15,720
I mean, AI is probabilistic;
it's always
704
00:37:15,720 --> 00:37:17,480
going to be wrong some
amount of the time,
705
00:37:17,900 --> 00:37:21,600
and customers need
to develop criteria:
706
00:37:21,600 --> 00:37:24,380
how good is good enough,
and how do I handle it
707
00:37:24,380 --> 00:37:26,740
when it's wrong? Because
those cases are going to happen.
708
00:37:28,980 --> 00:37:32,900
Yep, good points.
Yeah, let's see,
709
00:37:32,900 --> 00:37:36,440
Marie, any quick
comments here?
710
00:37:38,470 --> 00:37:40,520
Yeah, I did want to
speak just to the
711
00:37:40,520 --> 00:37:43,080
integration, just
that case, because
712
00:37:43,280 --> 00:37:45,080
It is really important, like
713
00:37:45,080 --> 00:37:47,320
these AI use cases, obviously,
714
00:37:47,640 --> 00:37:51,920
are complex. They require
a real detailed level
715
00:37:51,920 --> 00:37:54,300
of network observability
and specialized knowledge
716
00:37:54,300 --> 00:37:56,120
of the network. So you
really need to have
717
00:37:56,120 --> 00:37:59,300
an AI solution, kind of
call it full stack, like
718
00:37:59,300 --> 00:38:01,880
from the device to the
control layer to OSS.
719
00:38:02,340 --> 00:38:07,500
And so important to have
a solution that has those
720
00:38:07,500 --> 00:38:10,520
standards-based interfaces
I talked about, like
721
00:38:10,520 --> 00:38:14,200
A2A, or MCP, the model
context protocol, when you're
722
00:38:14,200 --> 00:38:16,660
leveraging your tools
when it comes to AI.
723
00:38:16,980 --> 00:38:21,020
So you really want
to have that seamless
724
00:38:21,020 --> 00:38:23,940
experience to address
use cases at the
725
00:38:23,940 --> 00:38:26,560
top level that our
customers are facing.
726
00:38:28,620 --> 00:38:29,600
Okay.
727
00:38:30,740 --> 00:38:31,720
Great.
728
00:38:31,730 --> 00:38:34,700
We'll move into the last
section and then we will
729
00:38:34,700 --> 00:38:37,740
have some time for questions.
So we do have some in,
730
00:38:37,740 --> 00:38:41,080
but now is a good time to
start asking, and we'll hit
731
00:38:41,080 --> 00:38:43,740
as many as we can on the
time at the end. So agentic
732
00:38:43,740 --> 00:38:47,420
AI, as I mentioned,
brand new for this year's
733
00:38:47,420 --> 00:38:50,380
transport survey. So Omar,
I'll let you – I know you've
734
00:38:50,380 --> 00:38:53,180
got two slides here to just
kind of quickly set the
735
00:38:53,180 --> 00:38:57,380
context and define agentic
AI for us. So go ahead.
736
00:38:57,840 --> 00:39:02,000
Sure. So we look at agentic
AI as pieces of software
737
00:39:02,000 --> 00:39:04,720
that can run out and do
things for us. So, you
738
00:39:04,720 --> 00:39:07,460
know, today, or yesterday,
if you wanted to find an
739
00:39:07,460 --> 00:39:09,300
answer to something, you
could use your own tools, you
740
00:39:09,300 --> 00:39:11,640
could find an expert,
and there's probably, you
741
00:39:11,640 --> 00:39:13,660
know, a big pile of data
in your organization that
742
00:39:13,660 --> 00:39:15,940
probably has information,
but you don't know what's
743
00:39:15,940 --> 00:39:18,860
there, it's hard to get it,
you know, get to the TL1
744
00:39:18,940 --> 00:39:21,340
interfaces and stuff that
Marie was talking about.
745
00:39:21,340 --> 00:39:25,340
But I think what agents
are very good at today is
746
00:39:25,340 --> 00:39:28,020
giving you better information.
You know, things like
747
00:39:28,020 --> 00:39:30,420
knowledge retrieval,
pulling operational status,
748
00:39:31,140 --> 00:39:33,220
doing analytics, those
kinds of things. Those are
749
00:39:33,220 --> 00:39:36,330
things that agents are very
capable of doing today, and
750
00:39:36,330 --> 00:39:39,260
a lot of folks have those
kinds of agents in use.
751
00:39:39,580 --> 00:39:43,060
The decision-making still
falls to your staff and
752
00:39:43,060 --> 00:39:45,980
yourself, but agents let
you make better, faster
753
00:39:45,980 --> 00:39:48,960
decisions by giving you better
information. where we're
754
00:39:48,960 --> 00:39:52,280
really trying to get to
as an industry is agents
755
00:39:52,280 --> 00:39:54,700
actually being able to give
you better answers so you
756
00:39:54,700 --> 00:39:57,380
can ask very uh where
today you still have to be
757
00:39:57,380 --> 00:40:00,840
very prescriptive saying
hey i want you know my top
758
00:40:00,840 --> 00:40:04,600
five most busiest links and
we can go you know we can
759
00:40:04,600 --> 00:40:07,240
very quickly pull that
pull that stuff up you know
760
00:40:07,240 --> 00:40:09,910
where we want to go is
much more broader more
761
00:40:09,910 --> 00:40:13,640
unbounded questions you can
ask like hey why is my video
762
00:40:13,640 --> 00:40:16,840
traffic to seattle so choppy
which is not necessarily
763
00:40:16,840 --> 00:40:18,900
clear what the cause is
or what the resolution
764
00:40:18,900 --> 00:40:21,980
would be and have agents be
able to, you know, figure
765
00:40:21,980 --> 00:40:24,140
out what the underlying
components of the question
766
00:40:24,140 --> 00:40:28,020
are, go find answers, do
reasoning analysis, those
767
00:40:28,020 --> 00:40:30,780
kinds of things, and not
just give you information,
768
00:40:30,780 --> 00:40:33,200
but give you recommendations
and give you answers.
769
00:40:33,240 --> 00:40:35,940
And some of the work
like semantic encoding
770
00:40:35,940 --> 00:40:38,200
and knowledge graphs
and just the evolution in how
771
00:40:38,200 --> 00:40:40,720
we use LLMs, are really
the drivers behind
772
00:40:40,720 --> 00:40:43,720
getting us to those kind
of operational models.
773
00:40:47,080 --> 00:40:50,800
All right. You've got one.
Did you want to hit this
774
00:40:50,800 --> 00:40:53,780
slide? Yeah, I'll hit it
real quick. So one of
775
00:40:53,780 --> 00:40:57,600
the reasons why agents
versus process automation
776
00:40:57,600 --> 00:41:00,300
goes back to what
we talked about, AI being
777
00:41:00,300 --> 00:41:03,340
probabilistic. So it's kind
of fuzzy in what it knows
778
00:41:03,340 --> 00:41:05,940
and what it doesn't know.
And one of the nice things
779
00:41:05,940 --> 00:41:09,240
with regular process
automation is it's very static,
780
00:41:09,240 --> 00:41:12,100
very rule-based.
If you say match this
781
00:41:12,100 --> 00:41:14,640
particular set of criteria,
then it knows what to do,
782
00:41:14,640 --> 00:41:17,800
and if it doesn't, like for
an ACL, for example, you'll
783
00:41:17,800 --> 00:41:19,780
get unpredictable
results, or it'll just
784
00:41:19,780 --> 00:41:23,180
ignore that uh things with
the nice thing with agents is
785
00:41:23,180 --> 00:41:27,200
they understand novel
situations better. So if it
786
00:41:27,200 --> 00:41:30,340
comes across
a set of traffic
787
00:41:30,340 --> 00:41:33,720
it hasn't seen before, but
it's similar to other
788
00:41:33,720 --> 00:41:36,480
traffic that you've trained
it on, it knows what to do.
789
00:41:36,480 --> 00:41:40,060
It'll make an attempt to, you
know, block the traffic
790
00:41:40,060 --> 00:41:43,820
or apply a QoS policy, those
kinds of things, and it learns
791
00:41:43,820 --> 00:41:46,120
and adapts. And this is
kind of the key advancement
792
00:41:46,780 --> 00:41:50,340
with using agents over
things folks are doing today.
793
00:41:51,020 --> 00:41:53,280
All right.
Yeah, thanks
794
00:41:53,280 --> 00:41:55,360
for that context,
Omar. With that,
795
00:41:55,980 --> 00:42:00,090
the first question we
asked in the survey was,
796
00:42:01,440 --> 00:42:05,180
how important will agentic
AI be, specifically
797
00:42:05,180 --> 00:42:06,980
to your transport network,
over the next three
798
00:42:06,980 --> 00:42:10,140
years? So you see nearly
half, 47%, expected it to
799
00:42:10,140 --> 00:42:13,460
be either very important
or absolutely critical.
800
00:42:13,670 --> 00:42:16,360
We ask these types
of questions a lot.
801
00:42:16,360 --> 00:42:21,280
And when you get that level
of percentage in those
802
00:42:21,280 --> 00:42:25,120
higher categories, the very
important or critical,
803
00:42:25,120 --> 00:42:29,680
You know, it's meaningful.
It means this is something
804
00:42:29,680 --> 00:42:32,880
that's clearly resonating
with the service providers.
805
00:42:32,880 --> 00:42:38,460
So encouraging there, but
still certainly very new.
806
00:42:39,720 --> 00:42:43,380
I think, for
me, maybe a little
807
00:42:43,380 --> 00:42:45,140
bit more aggressive
than I'd expected.
808
00:42:45,280 --> 00:42:48,340
But Omar, and then
Marie, I'll let you
809
00:42:48,340 --> 00:42:51,140
comment as well. Omar,
is this how you expected
810
00:42:51,140 --> 00:42:53,360
this result to come
in, or did you think
811
00:42:53,360 --> 00:42:56,700
there would be more
in the critical category?
812
00:42:57,710 --> 00:43:01,300
No, this is about what we'd
predict, right? I mean, I
813
00:43:01,300 --> 00:43:04,380
think this becomes the
mechanism for OPEX reduction:
814
00:43:04,380 --> 00:43:07,240
the ability to take tasks
that your staff does and
815
00:43:07,240 --> 00:43:10,080
hand them off to agents, and,
you know, also the agents
816
00:43:10,080 --> 00:43:12,880
making your staff more
effective and more productive
817
00:43:12,880 --> 00:43:15,200
with the things that, you
know, they
818
00:43:15,200 --> 00:43:18,620
are still doing. So this kind
of lines up with
819
00:43:18,620 --> 00:43:22,800
our worldview. Okay, sounds
good. Marie, any last
820
00:43:22,800 --> 00:43:26,020
comments on this one? Yeah,
I thought it was encouraging,
821
00:43:26,020 --> 00:43:28,880
actually. Like, in three
years? More than that, yeah.
822
00:43:28,880 --> 00:43:33,340
So they're moving
quickly, relatively quickly.
823
00:43:34,900 --> 00:43:37,340
That is an interesting
thing with the three-year
824
00:43:37,340 --> 00:43:42,380
timeline we give. Timelines,
for whatever reason, are always
825
00:43:42,380 --> 00:43:46,240
more ambitious than reality.
It tells us a direction,
826
00:43:46,240 --> 00:43:50,500
I find. But when I see
three in a question, I
827
00:43:50,500 --> 00:43:54,180
tend to, in my mind, instantly
translate it to five years.
828
00:43:54,180 --> 00:43:57,280
That seems to be more realistic,
so it tells us where
829
00:43:57,280 --> 00:44:01,160
they're going to get.
But you're right, they
830
00:44:01,160 --> 00:44:04,300
tend to not get there as
quickly. Challenges always
831
00:44:04,300 --> 00:44:06,140
pop up, and actually we're
going to get to challenges
832
00:44:06,140 --> 00:44:09,120
in a moment. They tend
to be a bit bigger,
833
00:44:09,120 --> 00:44:12,700
especially with early
technologies like this, than
834
00:44:13,060 --> 00:44:16,630
maybe what
operators are anticipating.
835
00:44:16,630 --> 00:44:21,620
Yeah. So we asked, and
then there's one last slide
836
00:44:21,620 --> 00:44:25,520
after this, and then we'll
hit Q&A. You know,
837
00:44:25,520 --> 00:44:28,800
we asked which agentic AI
capabilities they are planning
838
00:44:28,800 --> 00:44:30,640
for the transport network,
again within the next
839
00:44:30,640 --> 00:44:35,480
three years, so kind of the
midterm expectation. And
840
00:44:35,480 --> 00:44:39,540
again, this is not just AI,
it's specific to agentic AI.
841
00:44:39,540 --> 00:44:43,300
Assisting with troubleshooting,
live optimization of
842
00:44:43,300 --> 00:44:46,880
network performance, and even
the third one, generating
843
00:44:46,880 --> 00:44:50,260
on-demand reports with
natural language, I would say,
844
00:44:50,260 --> 00:44:54,260
are all fairly high for
this type of question. And
845
00:44:54,260 --> 00:44:57,260
then you can see the
last three kind of fade
846
00:44:57,260 --> 00:44:59,920
away from there, and this
was select all that apply,
847
00:44:59,920 --> 00:45:04,900
so, you know, certainly
not all of them resonated,
848
00:45:04,900 --> 00:45:09,540
but these top three did. Omar
and then Marie, for this
849
00:45:09,540 --> 00:45:12,260
one we'll let you comment.
Is this what you'd expect?
850
00:45:12,840 --> 00:45:15,660
Yeah, I mean, almost everyone
I talk to wants root cause
851
00:45:15,660 --> 00:45:17,960
analysis and troubleshooting.
You know, it probably
852
00:45:17,960 --> 00:45:20,540
has the biggest bang for
the buck, or is easiest to
853
00:45:20,540 --> 00:45:23,880
justify ROI for
customers, because usually
854
00:45:23,880 --> 00:45:26,880
there are hard, you know,
hard financial costs associated
855
00:45:26,880 --> 00:45:29,780
with mean time to repair,
time to identify, those
856
00:45:29,780 --> 00:45:33,060
kinds of things. But we
also see this is probably
857
00:45:33,060 --> 00:45:36,320
the hardest thing to pull
off well. It's kind
858
00:45:36,320 --> 00:45:38,200
of interesting that finding
documentation is the lowest
859
00:45:38,200 --> 00:45:40,480
on that, since that's
something that's
860
00:45:40,480 --> 00:45:43,780
very doable today with RAG
and those kinds of things.
861
00:45:44,400 --> 00:45:50,700
Right, doable, but maybe not
as much benefit in the
862
00:45:50,700 --> 00:45:53,900
end. So maybe there was
some prioritization, ROI,
863
00:45:53,900 --> 00:45:56,500
kind of, in there. Yeah,
that caught my eye as well.
864
00:45:56,860 --> 00:45:58,740
Marie, any thoughts
on this one?
865
00:45:59,540 --> 00:46:03,920
Yeah, no, just kind of
echoing what you guys
866
00:46:03,920 --> 00:46:07,920
said already. It's
really important for
867
00:46:08,200 --> 00:46:13,370
scalability, as networks are
scaling ever so much
868
00:46:13,640 --> 00:46:18,480
more. You really need to get
a handle on troubleshooting
869
00:46:18,480 --> 00:46:22,220
and network assurance,
so that really
870
00:46:22,220 --> 00:46:25,350
lends itself to agentic AI,
because we do that autonomous
871
00:46:25,350 --> 00:46:27,060
networking. You've got
these agents that are
872
00:46:27,060 --> 00:46:32,230
reasoning and acting of their
own accord. So, yeah.
873
00:46:36,000 --> 00:46:40,180
Sorry. As I said, there
is a very consistent theme
874
00:46:40,180 --> 00:46:43,440
across all of the data here
on the troubleshooting
875
00:46:43,440 --> 00:46:46,480
and the performance and the
reliability of the network.
876
00:46:46,480 --> 00:46:50,140
You can really see where
transport is
877
00:46:50,140 --> 00:46:54,700
focusing, with all of these
different but very
878
00:46:54,700 --> 00:46:59,100
tightly related technologies.
So I found that consistency
879
00:46:59,140 --> 00:47:03,840
very interesting and, I
think, encouraging.
880
00:47:06,180 --> 00:47:07,920
Yeah, with that,
let me go to
881
00:47:07,920 --> 00:47:10,990
the last
one we had,
882
00:47:11,500 --> 00:47:15,980
and this is concerns
around agentic AI
883
00:47:15,980 --> 00:47:18,560
for transport networks,
again, specifically.
884
00:47:18,560 --> 00:47:21,040
You know, this one
was, to me,
885
00:47:21,040 --> 00:47:22,200
interesting. As an
analyst, we always
886
00:47:22,200 --> 00:47:23,860
like to look at what
are the challenges.
887
00:47:24,600 --> 00:47:28,080
And I thought,
you know, again, to my
888
00:47:28,080 --> 00:47:29,880
point I just made about
improving accuracy and
889
00:47:29,880 --> 00:47:33,200
reliability driving a lot
of, you know, the
890
00:47:33,200 --> 00:47:36,680
drivers for automation and
AI. And yet, when you move
891
00:47:36,680 --> 00:47:40,060
to agentic AI, accuracy
and reliability becoming
892
00:47:40,060 --> 00:47:45,020
a top barrier to doing
that is an interesting, I
893
00:47:45,020 --> 00:47:48,280
think, dilemma for the operators
and for the technology.
894
00:47:48,280 --> 00:47:51,660
The data privacy
and security, you know, I
895
00:47:51,660 --> 00:47:54,080
wasn't surprised by that.
That's everywhere to do with
896
00:47:54,080 --> 00:47:57,490
automation and AI, you
know, so transport too.
897
00:47:58,860 --> 00:48:02,980
But yeah, this accuracy
and reliability.
898
00:48:04,280 --> 00:48:07,660
Marie, I'll let you
comment on this.
899
00:48:08,240 --> 00:48:12,400
And one thing I would like
to delve into a little bit
900
00:48:12,400 --> 00:48:15,960
with you and with Omar
is as vendors, how do you
901
00:48:15,960 --> 00:48:21,700
address that? Are these
concerns coming up? And are you
902
00:48:21,700 --> 00:48:26,480
as vendors able to
address these, to a degree?
903
00:48:27,810 --> 00:48:32,020
Yeah, well, on that point,
accuracy and reliability,
904
00:48:32,020 --> 00:48:35,640
it really all comes
down to starting from a
905
00:48:35,640 --> 00:48:38,520
solid foundation. Like, clean
network data is really
906
00:48:38,520 --> 00:48:43,250
important, so that will then
yield accurate, reliable
907
00:48:43,250 --> 00:48:47,880
results. So what we always,
you know, point our customers
908
00:48:47,880 --> 00:48:51,080
to is: clean up your
network. You know, we have
909
00:48:51,080 --> 00:48:53,780
network audits, etc., and you
have to spend a fair amount
910
00:48:53,780 --> 00:48:56,880
of time doing that to
get that solid foundation.
911
00:48:57,300 --> 00:49:00,600
And then also, circling back
to the network digital twin,
912
00:49:00,600 --> 00:49:04,180
right, that's another way to
get that reliability: use
913
00:49:04,180 --> 00:49:08,680
that network digital
twin capability to test,
914
00:49:08,680 --> 00:49:15,640
to validate those recommended
AI solutions before
915
00:49:15,640 --> 00:49:19,800
you apply them. So those
are two ways that we're
916
00:49:19,800 --> 00:49:23,000
addressing that accuracy
and reliability challenge.
917
00:49:23,000 --> 00:49:25,610
The other thing, in terms
of data privacy concerns,
918
00:49:26,140 --> 00:49:30,380
the second highest
challenge there, is really
919
00:49:30,380 --> 00:49:33,000
enabling customers to use
their own private cloud.
920
00:49:33,260 --> 00:49:35,340
So when people talk about
cloud, they typically
921
00:49:35,340 --> 00:49:38,280
think of public cloud
infrastructure for LLMs,
922
00:49:38,280 --> 00:49:41,200
for AI. But more and
more, especially the tier
923
00:49:41,200 --> 00:49:44,440
ones, they
really want to, you know,
924
00:49:44,600 --> 00:49:47,420
put some guardrails around it,
put some firewalls around
925
00:49:47,420 --> 00:49:50,300
it, and move that technology
to the private cloud, maybe
926
00:49:50,300 --> 00:49:54,900
use SLMs, small language
models, etc. So that's one
927
00:49:54,900 --> 00:49:58,200
way to address the data
privacy and security concerns.
928
00:49:59,000 --> 00:50:02,260
Yeah, we actually don't
have it in this presentation,
929
00:50:02,260 --> 00:50:05,680
but there was a separate
question around types
930
00:50:05,680 --> 00:50:09,800
of approach to,
you know, where you
931
00:50:09,800 --> 00:50:12,750
want to house the data: private
versus public versus hybrid,
932
00:50:12,750 --> 00:50:15,760
small language model. Do you
want to make any comments
933
00:50:15,760 --> 00:50:23,460
on that particular
result? Yeah, I now
934
00:50:23,460 --> 00:50:25,980
forget what the answer was
from that survey question.
935
00:50:25,980 --> 00:50:29,640
I think, yeah, well,
private did come in
936
00:50:29,640 --> 00:50:32,760
quite strongly. There was
also a lot of interest
937
00:50:32,760 --> 00:50:37,160
in hybrid, which,
you know, maybe is a bit of a
938
00:50:37,160 --> 00:50:40,720
hedge. Public was
minimal. Small language model
939
00:50:40,720 --> 00:50:43,880
was actually quite minimal,
too. Yeah, go ahead. Yeah.
940
00:50:43,880 --> 00:50:49,640
Yeah, so again, this
requires investment to
941
00:50:49,640 --> 00:50:52,640
house things in a
private cloud infrastructure,
942
00:50:52,690 --> 00:50:55,800
more investment up
front, anyway, than the
943
00:50:55,800 --> 00:50:58,340
public cloud infrastructure.
But yeah,
944
00:50:58,630 --> 00:51:03,200
in our discussions we are
seeing that move
945
00:51:03,200 --> 00:51:07,500
towards hybrid. Like, you use
the private cloud in some
946
00:51:07,500 --> 00:51:10,460
cases, public cloud in
others, and in fact you can
947
00:51:10,460 --> 00:51:13,320
have one as a
backup, so to speak, a backup
948
00:51:13,320 --> 00:51:17,300
of the other, in case
there are some issues with
949
00:51:17,300 --> 00:51:22,000
connectivity or access. So,
yeah, really nothing
950
00:51:22,000 --> 00:51:25,120
is coming to the fore
as dominant right now.
951
00:51:25,120 --> 00:51:29,120
Again, everyone's exploring,
and really at the early
952
00:51:29,120 --> 00:51:34,080
stages. Yeah, that's it. And
you do kind of find that
953
00:51:34,080 --> 00:51:36,820
when technologies are
new, it's kind of a
954
00:51:36,820 --> 00:51:39,940
buckshot approach. And
sometimes that's the way
955
00:51:39,940 --> 00:51:46,240
it goes; sometimes the
industry, you know,
956
00:51:46,700 --> 00:51:50,080
makes a
decision and moves to that. So
957
00:51:50,080 --> 00:51:53,980
I would probably say it's
too early to tell where
958
00:51:53,980 --> 00:51:56,320
we're headed. So the small
language model, sorry I put you
959
00:51:56,320 --> 00:51:59,700
on the spot; I realize we don't
have the data in front of
960
00:51:59,700 --> 00:52:03,700
us, but I know the small
language model was low. Do
961
00:52:03,700 --> 00:52:06,640
you feel operators are kind
of missing an opportunity
962
00:52:06,640 --> 00:52:09,470
for small language models,
or would you expect that
963
00:52:09,470 --> 00:52:14,420
that would be, you know, more
of a minority niche? I think it's
964
00:52:14,420 --> 00:52:17,320
still evolving. What
we're seeing currently, like
965
00:52:17,320 --> 00:52:20,840
right now, is really
leveraging the RAG
966
00:52:21,000 --> 00:52:23,440
technology, retrieval-augmented
generation,
967
00:52:23,440 --> 00:52:26,960
right, where you actually
tap into this specific
968
00:52:26,960 --> 00:52:30,620
knowledge base applicable
to your network before
969
00:52:30,620 --> 00:52:36,360
then using a general LLM.
So that is working
970
00:52:36,600 --> 00:52:41,040
well right now, I
guess I'd say, in
971
00:52:41,040 --> 00:52:44,960
the interim. So maybe
that'll then move to the
972
00:52:44,960 --> 00:52:49,860
small language model later
on. Okay. Omar, we'll
973
00:52:49,860 --> 00:52:51,740
let you have the final
word. We do have some good
974
00:52:51,740 --> 00:52:55,220
questions in, so I want
to get to those. On the
975
00:52:55,220 --> 00:52:57,980
challenges, and maybe
really kind of homing in
976
00:52:57,980 --> 00:53:03,120
specifically on how Cisco, as
a major supplier, is helping
977
00:53:03,120 --> 00:53:07,260
to address some of these
around agentic AI. Sure. So,
978
00:53:07,260 --> 00:53:09,760
you know, something I
touched on earlier: I think
979
00:53:09,760 --> 00:53:12,520
with accuracy and reliability,
a big piece of that sits
980
00:53:12,520 --> 00:53:15,040
with customers, right? They
need to understand what
981
00:53:15,380 --> 00:53:18,460
level of model performance
is acceptable for them.
982
00:53:18,460 --> 00:53:22,480
So if a model has 90%
performance, 90% accuracy,
983
00:53:22,500 --> 00:53:25,050
is that good enough to put
in production? But also,
984
00:53:25,620 --> 00:53:27,380
what processes do they put in
985
00:53:27,380 --> 00:53:29,000
place to handle the other 10%?
986
00:53:29,140 --> 00:53:32,500
If you have an employee
that's wrong 10% of
987
00:53:32,500 --> 00:53:36,500
the time, you have
mechanisms to deal with
988
00:53:36,500 --> 00:53:38,620
that. And you need
similar things for the
989
00:53:38,620 --> 00:53:41,190
agents, the technologies
that you deploy.
990
00:53:41,500 --> 00:53:46,000
From our perspective, from
a feature perspective,
991
00:53:46,000 --> 00:53:49,780
we try and put confidence
intervals on any output
992
00:53:49,780 --> 00:53:52,900
that we provide back to a
system or back to a user
993
00:53:52,900 --> 00:53:54,760
in terms of, okay, this
is the answer and we have
994
00:53:54,760 --> 00:53:57,900
this percent confidence
that this is a good answer
995
00:53:57,900 --> 00:54:00,220
and let the human decide
whether they're good
996
00:54:00,220 --> 00:54:02,010
with that or they want to
do some further research.
997
00:54:02,550 --> 00:54:04,300
Other than that, the
kind of usual things like
998
00:54:04,300 --> 00:54:07,420
transparency, things like
dry run capabilities, we
999
00:54:07,420 --> 00:54:09,920
talked about digital twin
down the road, so it's less
1000
00:54:09,920 --> 00:54:11,830
of a roll of the dice from
a customer perspective
1001
00:54:12,280 --> 00:54:14,940
if they start deploying
some of these technologies.
1002
00:54:15,820 --> 00:54:20,060
All right. So safe to say
from your perspective,
1003
00:54:20,060 --> 00:54:22,500
yes, the challenges are
there, but they're all
1004
00:54:23,360 --> 00:54:25,860
addressable with the
vendors, in combination, it
1005
00:54:25,860 --> 00:54:27,300
sounds like, with things that
the service providers
1006
00:54:27,300 --> 00:54:32,520
themselves also need to
do to address trust. Yeah.
1007
00:54:32,880 --> 00:54:37,720
Good stuff. Let's jump
into the Q&A. We've
1008
00:54:37,720 --> 00:54:40,000
got a few minutes left.
We won't get to them
1009
00:54:40,000 --> 00:54:44,360
all, but I know our
sponsors will be happy
1010
00:54:44,360 --> 00:54:48,580
to follow up one-on-one
on ones we miss.
1011
00:54:49,160 --> 00:54:52,100
You know, this one, Omar,
I'll start with you.
1012
00:54:55,660 --> 00:54:58,100
It relates to the
agentic AI, and it's
1013
00:54:58,100 --> 00:54:59,600
a question of interest
to me as well.
1014
00:54:59,600 --> 00:55:01,440
Question is, are agents always
1015
00:55:01,440 --> 00:55:04,760
based on large language models?
1016
00:55:04,960 --> 00:55:08,520
And really, for me, part
of it is teasing out, you
1017
00:55:08,520 --> 00:55:10,560
know, you have AI, you have
agentic AI, and you have
1018
00:55:10,560 --> 00:55:12,820
large language models,
and I think large language
1019
00:55:12,820 --> 00:55:17,060
models tend to get the
bulk of attention. I think
1020
00:55:17,060 --> 00:55:20,100
there's some confusion there.
So from your perspective,
1021
00:55:21,210 --> 00:55:23,680
the role of large
language models,
1022
00:55:23,680 --> 00:55:25,980
specifically with
this agentic AI?
1023
00:55:26,120 --> 00:55:27,720
Yeah, I'll give you an emphatic
1024
00:55:27,720 --> 00:55:30,220
no for a couple of reasons.
1025
00:55:30,220 --> 00:55:34,240
I think agents, you know,
AI is a bunch of different
1026
00:55:34,240 --> 00:55:37,380
technologies, not just
LLMs. And sometimes other
1027
00:55:37,380 --> 00:55:40,860
technologies are better
solutions than LLMs are. A lot
1028
00:55:40,860 --> 00:55:44,080
of network traffic
analysis is time-series
1029
00:55:44,080 --> 00:55:47,240
analysis, which is kind
of machine learning, and we
1030
00:55:47,240 --> 00:55:50,720
have lots of proven, high-accuracy
solutions for handling
1031
00:55:50,720 --> 00:55:54,500
that. So I think at some
point you take a, you
1032
00:55:54,500 --> 00:55:57,180
know, a traditional
technical analysis and
1033
00:55:57,180 --> 00:56:00,400
not just chase the bright and
shiny and say, okay, LLMs are
1034
00:56:00,400 --> 00:56:04,460
cool, but, you know, we have
better ways, maybe not as
1035
00:56:04,460 --> 00:56:07,340
flashy, but, you know, more
accurate, more functional ways
1036
00:56:07,340 --> 00:56:11,240
of doing things. I think
also, you know, tying
1037
00:56:11,240 --> 00:56:14,120
back to the conversation
you were just having, an
1038
00:56:14,120 --> 00:56:16,020
LLM might be the better
solution, but it's not
1039
00:56:16,020 --> 00:56:18,160
going to be an acceptable
solution from the security or
1040
00:56:18,160 --> 00:56:21,160
data privacy perspective.
So then, yeah, maybe you do
1041
00:56:21,280 --> 00:56:24,100
use more on-prem technologies,
like, you know, a small
1042
00:56:24,100 --> 00:56:28,520
model run on-prem, or
stick to heuristics or
1043
00:56:28,520 --> 00:56:32,260
algos or machine learning
instead. I think over time a
1044
00:56:32,260 --> 00:56:35,580
lot of them will move to
LLMs for other reasons,
1045
00:56:35,580 --> 00:56:38,080
and there are time-series
foundation models based on LLMs
1046
00:56:38,080 --> 00:56:40,240
that are starting to become
available. But certainly
1047
00:56:40,240 --> 00:56:44,820
today, often LLMs are
not the best, you know,
1048
00:56:44,820 --> 00:56:50,120
thing to start off with.
Okay, excellent. Marie,
1049
00:56:50,120 --> 00:56:55,120
a couple came
in around digital twin.
1050
00:56:55,120 --> 00:56:57,700
I'll let you start those,
and I guess, Omar, if you
1051
00:56:57,700 --> 00:57:02,300
wanted to chime in as well.
But the first one that
1052
00:57:02,300 --> 00:57:06,320
came in was: is the
network digital twin fully
1053
00:57:06,320 --> 00:57:09,040
virtualized on commodity
hardware? Does it require
1054
00:57:09,340 --> 00:57:12,520
GPUs and TPUs? So kind
of a hardware question
1055
00:57:12,520 --> 00:57:15,400
around digital twin.
Does Ciena have any
1056
00:57:15,400 --> 00:57:21,660
perspective on that one?
Yeah, it can just be
1057
00:57:21,660 --> 00:57:23,840
virtualized on commodity
hardware. It doesn't
1058
00:57:23,840 --> 00:57:27,120
require, like, the
specialized processing
1059
00:57:27,120 --> 00:57:30,980
that we now
know and expect from
1060
00:57:31,100 --> 00:57:34,100
gen AI tools, from
large language models like
1061
00:57:34,100 --> 00:57:39,160
GPT. So certainly,
I guess I'd say
1062
00:57:39,160 --> 00:57:42,060
it's all in the implementation,
whether or not it's
1063
00:57:42,060 --> 00:57:46,700
actually fully integrated
within an
1064
00:57:47,380 --> 00:57:50,040
agentic AI framework,
I'd say, within the
1065
00:57:50,040 --> 00:57:54,180
model itself. But for
now, yeah, certainly it
1066
00:57:54,180 --> 00:57:57,440
doesn't require
specialized GPUs or TPUs.
1067
00:57:58,180 --> 00:58:01,040
Okay. And then the other one,
1068
00:58:01,140 --> 00:58:03,560
which is part of a larger
1069
00:58:04,040 --> 00:58:08,780
trend, and it came out in
the survey as well, just
1070
00:58:08,780 --> 00:58:12,260
generally about the legacy
network and integration with
1071
00:58:12,260 --> 00:58:15,560
the existing. This question
here was specific to Digital
1072
00:58:15,560 --> 00:58:19,700
Twin, and it was actually
asked as well more generally,
1073
00:58:19,700 --> 00:58:23,240
but maybe on the Digital
Twin part. For Digital Twin,
1074
00:58:23,500 --> 00:58:26,160
it looks like all network
elements should support SDN.
1075
00:58:26,180 --> 00:58:30,260
How do you deal with the
legacy network elements in
1076
00:58:30,260 --> 00:58:33,920
trying to build Digital
Twin? And maybe that's one
1077
00:58:33,920 --> 00:58:38,480
of the biggest challenges
to adoption, why it isn't
1078
00:58:38,480 --> 00:58:42,960
happening super quick. But,
yeah, anything there? Yeah, it certainly
1079
00:58:42,960 --> 00:58:45,420
is a challenge, because those
legacy network elements,
1080
00:58:45,420 --> 00:58:48,580
you know, they're just
not as programmable. Like,
1081
00:58:48,580 --> 00:58:52,680
not only do they not share
telemetry northbound in as,
1082
00:58:52,680 --> 00:58:56,560
I'd say, smooth and
real-time a manner as
1083
00:58:56,560 --> 00:59:00,080
modern ones, but they
also aren't quite as
1084
00:59:00,080 --> 00:59:03,660
programmable. So certainly
there are, you know, specialized
1085
00:59:03,660 --> 00:59:06,500
techniques that you can
use to circumvent that
1086
00:59:06,500 --> 00:59:10,380
and to get an
accurate view of the network.
1087
00:59:10,380 --> 00:59:12,760
Again, it's not all about
real-time streaming. You can
1088
00:59:12,760 --> 00:59:16,260
have a network digital twin
that has very good capabilities
1089
00:59:16,260 --> 00:59:20,220
that operates on snapshots
of network data,
1090
00:59:20,220 --> 00:59:24,840
deltas of what's changed,
etc., that
1091
00:59:24,840 --> 00:59:28,200
will then be able to address
the use cases, and you
1092
00:59:28,200 --> 00:59:31,080
can still run those very
valuable what-if scenarios.
1093
00:59:31,900 --> 00:59:34,520
So, yeah, certainly something
that has to be taken
1094
00:59:34,520 --> 00:59:37,160
into account. It is
challenging, but it's doable.
1095
00:59:38,300 --> 00:59:39,920
All right. Excellent. We'll
1096
00:59:39,920 --> 00:59:42,140
close out with
one last question.
1097
00:59:43,240 --> 00:59:46,820
Omar, I'll let you take it and
1098
00:59:46,820 --> 00:59:49,220
kind of in context, actually.
1099
00:59:50,460 --> 00:59:53,680
We had a question about
workforce impacts. And there
1100
00:59:53,680 --> 00:59:59,700
is some anxiety in the
market about impacts of AI
1101
00:59:59,700 --> 01:00:01,960
on workforce. It showed
up in the survey. There's
1102
01:00:01,960 --> 01:00:06,560
some, not dire concern, but
it certainly was a finding.
1103
01:00:06,560 --> 01:00:08,840
The question here is
somebody's wondering, how do
1104
01:00:08,840 --> 01:00:12,480
I AI-proof my career? As
things move to AI,
1105
01:00:12,840 --> 01:00:17,000
the people on this
call, what do they do?
1106
01:00:17,000 --> 01:00:19,880
Omar, I'll let you
close out with that one.
1107
01:00:20,320 --> 01:00:22,080
I think the thing to do is
1108
01:00:22,080 --> 01:00:23,700
stay on top of the technology,
1109
01:00:23,780 --> 01:00:26,620
you know, find a
couple of blogs,
1110
01:00:26,620 --> 01:00:27,850
find a couple of
YouTube channels,
1111
01:00:28,480 --> 01:00:31,080
those kinds of things,
and just understand
1112
01:00:31,080 --> 01:00:33,320
and stay on top of
what's going on,
1113
01:00:33,520 --> 01:00:34,720
and then try and figure out
1114
01:00:34,720 --> 01:00:36,100
how it applies to your day job.
1115
01:00:36,640 --> 01:00:38,780
You know, I think the trick
for the folks who will
1116
01:00:38,780 --> 01:00:41,720
be successful is they can
look at what's, you know,
1117
01:00:41,820 --> 01:00:44,020
what the newest technologies
are and figure out how to
1118
01:00:44,020 --> 01:00:45,520
apply them or turn around
and say, you know what,
1119
01:00:45,520 --> 01:00:48,900
this is not great; this
will not work in our
1120
01:00:48,900 --> 01:00:51,540
environment for these reasons.
And that's the kind of
1121
01:00:51,660 --> 01:00:53,960
insight that their
leadership will appreciate.
1122
01:00:54,730 --> 01:00:58,420
Okay. Yeah, and
certainly one of the
1123
01:00:58,640 --> 01:01:01,120
findings throughout was
that there is certainly
1124
01:01:01,120 --> 01:01:04,600
an interest in humans
still in the loop, you know,
1125
01:01:04,600 --> 01:01:08,000
to that trust and reliability
point. With that,
1126
01:01:08,000 --> 01:01:11,140
we're a little over, so we'll
close out here. Really
1127
01:01:11,140 --> 01:01:13,880
enjoyed working on this
project with you, Marie and
1128
01:01:13,880 --> 01:01:16,720
Omar, and the Cisco and
Ciena teams. Thanks a lot
1129
01:01:16,720 --> 01:01:19,660
for your insights on this
call and all the work in
1130
01:01:19,660 --> 01:01:22,660
putting this together. Thanks
to the audience for tuning
1131
01:01:22,660 --> 01:01:26,640
in and for your great
questions and look for the
1132
01:01:26,640 --> 01:01:28,820
white paper to come out over
the next couple of weeks.
1133
01:01:28,820 --> 01:01:30,200
Thanks a lot, everyone.
1134
01:01:30,740 --> 01:01:31,900
Thank you.
1135
01:01:33,860 --> 01:01:34,780
Thanks.