Privacy Please
A genuine and informative podcast about data privacy and security. Your reliable place for best practices, interviews, belly laughs, and real stories.
Privacy Please
Ep. 25 - Dr. Gabriela Zanfir-Fortuna, Senior Counsel Future of Privacy Forum & Bob Eckman, CISO, Kent State University
In this week's special, two-guest episode of Privacy Please, we invited on Dr. Gabriela Zanfir-Fortuna, Senior Counsel at the Future of Privacy Forum, and Bob Eckman, CISO of Kent State University. We learn about each of their stories and how they landed where they are today. We then dissect a report Gabriela wrote, 'The General Data Protection Regulation: Analysis and Guidance for US Higher Education Institutions'. Then we discuss identifying data flows, retention schedules, who gets these assignments in organizations, guilty pleasures, what three questions Bob would ask his dog Toby if he could talk, and much, much more! https://fpf.org/wp-content/uploads/2020/05/FPF_GDPR-Report-Final.pdf
This was such a good episode even for anyone outside of universities because it really gets you thinking differently about data privacy in general. Please don't miss this one and don't forget to share, subscribe, and enjoy!
-Cam
1
00:00:04.440 --> 00:00:13.530
Cameron Ivey: Alright, ladies and gentlemen. Welcome to Privacy Please. I am your host, Cameron Ivey, and with me as always is Gabe Gumbs. Today we have an awesome episode; we have two special guests.
2
00:00:13.559 --> 00:00:15.150
Cameron Ivey: On the first one is
3
00:00:15.360 --> 00:00:19.860
Cameron Ivey: Bob Reckman, the CISO of Kent State University. Bob, thanks for coming on. How you doing?
4
00:00:20.400 --> 00:00:22.230
reckman: I'm doing good. That's excellent. Reckman it is.
5
00:00:23.460 --> 00:00:25.920
Cameron Ivey: Oh my gosh, Robert Kirkman. Yep. You know what
6
00:00:26.130 --> 00:00:27.240
reckman: You want to start over.
7
00:00:27.510 --> 00:00:28.170
Cameron Ivey: Probably, yeah.
8
00:00:28.440 --> 00:00:29.280
Gabe Gumbs: Yeah yeah
9
00:00:29.550 --> 00:00:32.070
Cameron Ivey: Yeah, let's go ahead and just start over. I'll keep it recording, but
10
00:00:32.730 --> 00:00:33.180
Cameron Ivey: Alright.
11
00:00:33.240 --> 00:00:34.110
reckman: See you. There we go.
12
00:00:35.460 --> 00:00:35.850
reckman: We can
13
00:00:36.510 --> 00:00:39.030
Cameron Ivey: Yeah, I can. I can cut all this, but it
14
00:00:39.360 --> 00:00:42.090
Cameron Ivey: Will put it in the you know the whatever they
15
00:00:42.120 --> 00:00:42.450
Were Real
16
00:00:43.710 --> 00:00:47.670
Cameron Ivey: A reel, yeah. Outtakes. Awesome. Alright. Well, screwed that up.
17
00:00:48.600 --> 00:00:49.290
reckman: Start over.
18
00:00:49.800 --> 00:00:50.370
Cameron Ivey: All right, here we go.
19
00:00:51.990 --> 00:01:05.310
Cameron Ivey: Ladies and gentlemen, welcome to Privacy Please. I'm your host, Cameron Ivey, and with me as always is Gabe Gumbs. We have two very special guests on today. The first is Bob Eckman, the CISO of Kent State. Bob, thanks for coming on.
20
00:01:05.700 --> 00:01:06.900
reckman: Thank you very much for having me.
21
00:01:06.960 --> 00:01:13.140
Cameron Ivey: Appreciate it. Yeah, absolutely. And if you want to just kind of give your background, real quick and then we'll get to the second guest.
22
00:01:13.590 --> 00:01:15.810
Cameron Ivey: Share with the listeners who you are and how you got to where you are.
23
00:01:16.140 --> 00:01:24.090
reckman: Yeah, I've been in security for probably the better part of, gosh, over a decade now, at least. Started out cutting my teeth in nuclear power. So I
24
00:01:24.720 --> 00:01:29.370
reckman: Helped to establish the cybersecurity program for nuclear power generation in Ohio and Pennsylvania.
25
00:01:30.270 --> 00:01:45.270
reckman: You know, that's really a quick way to get dropped in the deep end of security. From there, I've done a tremendous amount of consulting, worked with Fortune 100 companies all over Cleveland and Northeast Ohio, helping them to implement security programs as a, you know, rent-a-Bob, if you will.
26
00:01:45.330 --> 00:01:47.040
Cameron Ivey: A CISO for hire, and
27
00:01:47.340 --> 00:02:00.390
reckman: The opportunity at Kent State presented itself over a year ago, and I've been grateful and honored to be their CISO since. I'm also a faculty member there as well. I teach in digital sciences; I teach cybersecurity and digital system security courses as well.
28
00:02:01.200 --> 00:02:08.730
Cameron Ivey: That's awesome. Thanks for sharing that. And our second guest is Gabriela Zanfir-Fortuna. Fortuna? Did I say that right?
29
00:02:09.960 --> 00:02:12.000
Cameron Ivey: I might have said "for tuna," but
30
00:02:12.180 --> 00:02:12.870
Gabriela: I'll take it.
31
00:02:13.560 --> 00:02:19.350
Cameron Ivey: Alright, that works. Thank you so much for coming on. She's Senior Counsel at the Future of Privacy Forum.
32
00:02:20.910 --> 00:02:24.030
Cameron Ivey: Can you just give us your backstory, who you are and how you got to where you're
33
00:02:24.930 --> 00:02:31.440
Gabriela: Oh, for sure. Hello, everyone, and thank you, Cameron, for having me as a guest today on your podcast.
34
00:02:32.130 --> 00:02:45.150
Gabriela: Well, I am a specialist in European Union data protection law and privacy law. I have, what, 10 years of experience in the field. Um, I started as a researcher in Romania at the University of Craiova,
35
00:02:45.780 --> 00:02:59.610
Gabriela: Where I completed an LLM in human rights, with a focus on data protection, and then I completed my PhD in law, with a focus on the rights of the data subject, as we call them in Europe, meaning the individuals whose personal data
36
00:03:00.780 --> 00:03:15.240
Gabriela: Are being processed. And after that I moved to Brussels; I started to work for the European Data Protection Supervisor, which is the EU data protection authority for all of the
37
00:03:15.270 --> 00:03:21.900
Gabriela: EU institutions and bodies, um, but also the official advisor of the EU legislator.
38
00:03:22.500 --> 00:03:36.960
Gabriela: And this is how I got to be in Brussels in the most exciting of times. And that was the time when the GDPR was being negotiated and I got to be involved in the GDPR legislative file from
39
00:03:38.160 --> 00:04:03.000
Gabriela: As part of the team that advised the legislator. And in 2016, after all of that finally came to an end and the GDPR was adopted, I moved to the United States, and here I am now. I've been working for the Future of Privacy Forum for close to four years now, and I've always
40
00:04:04.350 --> 00:04:15.570
Gabriela: Kept you know close to what's been happening in the EU, and I've been trying to bring understanding about the GDPR in the EU data protection law here in the US.
41
00:04:16.260 --> 00:04:19.350
Cameron Ivey: Awesome. So let's go ahead and get started.
42
00:04:20.370 --> 00:04:27.180
Cameron Ivey: For everyone out there. What is GDPR. What does it mean, can you give us a just a simple background.
43
00:04:28.440 --> 00:04:42.870
Gabriela: Yes, the GDPR stands for the general data protection regulation and it's a legislative act of the European Union, which directly applies in all of the Member States of European Union.
44
00:04:43.470 --> 00:04:54.750
Gabriela: Um, prior to the GDPR we had another law of the EU, and that was the previous data protection directive, Directive 95/46/EC,
45
00:04:55.620 --> 00:05:07.650
Gabriela: Which had many of the concepts with which the GDPR operates and many of the legal institutions. Many of the rules, you had the rights of the data subject as well.
46
00:05:08.430 --> 00:05:21.750
Gabriela: In the directive, you had rules for international data transfers; you had a very broad application as well. Only that directive was adopted in 1995, and
47
00:05:22.260 --> 00:05:40.260
Gabriela: There was clearly a need to update that type of legislation for the internet age and for the big data age for the algorithm age and this is what happened with the GDPR which technically revamped the former directive.
48
00:05:41.370 --> 00:05:49.710
Gabriela: Which, in its turn, is actually also a result of another legislative process started in the '70s at Member State level, when
49
00:05:50.160 --> 00:06:08.100
Gabriela: The different European states that were part of the EU at that time felt that, as computers started, you know, being able to store a lot of data and public administrations started to keep electronic files about people,
50
00:06:08.910 --> 00:06:28.890
Gabriela: Maybe there should be some rules in place to make sure that however this data is being collected, stored, processed is fair towards the individuals, and that these individuals have some sort of rights to know who keeps data about them and for what purposes; they have some sort of rights to
51
00:06:29.940 --> 00:06:43.020
Gabriela: Ask for correction of their data if their public administration somehow messed things up, or to even ask for erasure. So we have the right of erasure in the first data protection laws in Europe from,
52
00:06:43.890 --> 00:06:52.020
Gabriela: You know, the late 70s and 80s, maybe not the very, very first one in 1970 in Germany, but then
53
00:06:53.040 --> 00:07:18.990
Gabriela: It appeared in the subsequent ones. So my point is that the GDPR is a huge step in a long evolution of laws, starting in Europe in the '70s, that focus on collecting and using data about individuals in automated ways.
54
00:07:20.550 --> 00:07:31.770
Gabe Gumbs: So, so, Doctor, you are Senior Counsel over at the Future of Privacy Forum. So, if you would, for our guests that are unfamiliar, tell us a little bit about the Future of Privacy Forum.
55
00:07:32.160 --> 00:07:40.650
Gabe Gumbs: And also this report that you penned: "The General Data Protection Regulation: Analysis and Guidance for US Higher Education Institutions."
56
00:07:41.550 --> 00:07:46.590
Gabriela: Sure. So the Future of Privacy Forum is a think tank that
57
00:07:47.790 --> 00:08:03.300
Gabriela: Promotes data optimism and responsible data uses, in the sense that we at FPF believe that personal data can be processed in responsible ways.
58
00:08:04.050 --> 00:08:19.170
Gabriela: So we have stakeholders from regulators, from industry and all sorts of branches of the industry, from big tech to automakers to startups to the sharing economy.
59
00:08:20.730 --> 00:08:41.340
Gabriela: We also have stakeholders from the education sphere. We have a very, very prominent student privacy work stream that's led by my colleague, Amelia Vance, and I'm sure many of your listeners in higher ed are familiar with her work and her team's work.
60
00:08:43.200 --> 00:08:48.210
Gabriela: And among all of our work streams. We also have
61
00:08:49.320 --> 00:09:03.540
Gabriela: A focus on European privacy law and global privacy trends. And when the GDPR was adopted and it had this very strong extraterritorial effect,
62
00:09:05.640 --> 00:09:07.710
Gabriela: That was sort of my
63
00:09:08.760 --> 00:09:25.680
Gabriela: Best way to fit in with this team, because I tried to explain how the GDPR becomes applicable to entities here in the United States.
64
00:09:26.130 --> 00:09:34.050
Gabriela: And within this context, my colleagues that are working on the education work stream
65
00:09:34.590 --> 00:09:42.810
Gabriela: Wanted to dive deeper into the idea of the GDPR being applicable to education institutions here in the US.
66
00:09:43.050 --> 00:09:56.640
Gabriela: So we started to look into the conditions under which higher ed, particularly higher ed institutions, because they are the ones that interact most, right, with people in Europe, are affected by the GDPR.
67
00:09:57.570 --> 00:10:15.630
Gabriela: And this is how we came up with the idea of writing a report, or a guide, to try and help them understand when the GDPR applies to them and when it does not apply to them.
68
00:10:16.380 --> 00:10:28.920
Gabriela: And if by any chance the GDPR applies. We also wanted to, to help with information on the latest guidance from European data protection authorities.
69
00:10:29.790 --> 00:10:42.330
Gabriela: On what their compliance should look like if they want to set out norms and a framework to protect the data that's coming in from Europe.
70
00:10:43.020 --> 00:10:51.120
Gabe Gumbs: Excellent. As some of you may know, we have a healthy number of listeners that are in the education space, which is why
71
00:10:51.660 --> 00:11:02.100
Gabe Gumbs: We actually have Bob on the line as well. So, Bob, I want to turn the floor over to you, for others that are in your role and with an institution such as yours.
72
00:11:02.760 --> 00:11:05.520
Gabe Gumbs: Maybe you can help us lead the conversation and
73
00:11:06.210 --> 00:11:19.020
Gabe Gumbs: Kind of tease out some of those best practices. But if you could first, Bob, just kind of lay out the depth of the challenge that you face. How large is Kent State? How is it organized and structured? I imagine similar to other institutions.
74
00:11:22.380 --> 00:11:23.730
Gabe Gumbs: I think you're on mute, up
75
00:11:25.470 --> 00:11:33.990
reckman: There we go. Sorry about that. My dog is barking early and I apologize. So we have roughly 25,000 active students. So maybe 4000 or so employees.
76
00:11:34.770 --> 00:11:40.770
reckman: But overall, we have about 400,000 active identities that we have to protect on a daily basis. We are multi regional
77
00:11:41.220 --> 00:11:50.820
reckman: We have many regional campuses here in Northeast Ohio and elsewhere. And we're also international we have not necessarily campuses, but offices and relationships with schools overseas.
78
00:11:51.240 --> 00:12:03.510
reckman: As well. So obviously the GDPR program in total is of grave interest, not just to myself, but to many schools who are not just doing business in the European Union economic zone, but who are recruiting students from that area as well.
79
00:12:04.620 --> 00:12:18.630
Gabe Gumbs: Excellent. And so as GDPR goes, as applied to higher education institutions here stateside, what were some of the initial challenges that you had in just understanding and interpreting how it might be applicable?
80
00:12:19.170 --> 00:12:29.700
reckman: Yeah, so the applicability of it. You know, when you read the GDPR law, because it is a law in the European Union, it really is like your typical regulation or law:
81
00:12:30.330 --> 00:12:38.190
reckman: Not always very clear in how things are defined; the language at times gets a little amorphous, and you're not quite sure how to interpret it.
82
00:12:39.300 --> 00:12:43.590
reckman: And so what we've done is we've really put in place a program that we feel is appropriate for Kent State.
83
00:12:44.190 --> 00:12:50.130
reckman: Following guidance on what many schools in the States are doing and just, you know, applying those controls where we need to apply them.
84
00:12:50.490 --> 00:12:56.700
reckman: But I'll tell you, I think what it's done overall, and I don't want to speak for Kent State necessarily, I want to speak for higher education, if you will:
85
00:12:57.360 --> 00:13:05.130
reckman: What it's done is it's made all schools kind of take a step back and look at their privacy program in total. We're starting to see some new trends relative to privacy
86
00:13:05.400 --> 00:13:14.310
reckman: That we haven't seen previously, one of which is the standing up of something called a privacy office, which, prior to just a couple years ago, I had never heard of before.
87
00:13:14.640 --> 00:13:20.340
reckman: But it's dedicated; you know, sometimes it's a website, sometimes organizations have a Chief Privacy Officer role
88
00:13:21.360 --> 00:13:26.790
reckman: That, you know, comes with an office and actual people. But either way, the idea being it's your one-stop shop for privacy.
89
00:13:27.150 --> 00:13:40.110
reckman: That's the place you go to communicate how we're doing privacy how we're approaching privacy, how we treat your data and I really like that approach from a, from a just a confidentiality perspective and making people feel comfortable interacting with the environment.
90
00:13:42.480 --> 00:13:43.620
reckman: Gabe, I think you're on mute now.
91
00:13:45.240 --> 00:13:58.980
Cameron Ivey: So, I'm curious, since you were kind of touching on privacy, and this could be a question for both Gabriela and Bob: when you hear the term data privacy, what does that mean to you personally, and what does it mean to your organization?
92
00:14:00.780 --> 00:14:02.400
reckman: Gabi, or, Doctor, you want to start?
93
00:14:03.390 --> 00:14:09.810
Gabriela: I can start. Oh, when I hear the term data privacy, I think, what comes with
94
00:14:10.980 --> 00:14:12.810
Gabriela: It? Because
95
00:14:14.340 --> 00:14:25.500
Gabriela: In Europe, you know, we don't operate with this term; we actually have two terms that we operate with, because in the European Union legal framework
96
00:14:25.980 --> 00:14:36.540
Gabriela: We protect two different rights, and one of them is the right to privacy, or the right to respect for private life and family life and confidentiality of communications.
97
00:14:37.290 --> 00:14:46.830
Gabriela: That's, you know, bubble. And then we have the right to data protection. So, the right to the protection of personal data.
98
00:14:47.310 --> 00:14:57.660
Gabriela: And that's in another bubble. Now the trouble is that sometimes this two bubbles interact and when we talk about data protection.
99
00:14:58.620 --> 00:15:07.320
Gabriela: In Europe we talk about protection of personal data. So as long as that data is personal, it will obviously interact many times with
100
00:15:07.890 --> 00:15:28.710
Gabriela: The idea of private life and privacy. However, data protection, as I understand it from the European point of view, refers actually to the fact that personal data can be used, they even should be used.
101
00:15:29.730 --> 00:15:43.230
Gabriela: You know you can collect them and analyze them and use them. However, the way you're doing this needs to respect some rules of the road, you know, need to
102
00:15:43.740 --> 00:15:50.490
Gabriela: If you want to start collecting personal data and using them. You need to have a justification for why you do that.
103
00:15:51.180 --> 00:16:02.850
Gabriela: You need to set up some basic principles; you need to think particularly about why you have to collect this personal data, so you have some purpose, a specific purpose, mind you.
104
00:16:04.470 --> 00:16:10.260
Gabriela: You cannot just collect data because at one point in the future, you might find some use for them.
105
00:16:11.340 --> 00:16:23.010
Gabriela: So this is why I'm telling you that when I hear data privacy, I am conflicted, because where I come from, we used to distinguish between these two
106
00:16:23.520 --> 00:16:46.350
Gabriela: Concepts, while at the same time recognizing that they relate to each other. And, you know, it's a bit sort of difficult to communicate, particularly to the larger public, why these two rights are different and distinct values, or are conceptualized in a different way, in the European Union.
107
00:16:47.460 --> 00:16:51.660
Gabriela: So this is, this is what I think about when you asked me about data privacy.
108
00:16:52.740 --> 00:17:14.340
Gabriela: When I think of privacy, privacy is obviously a value that I care a lot about. I think of privacy as a fundamental human right, and I hope this will be an approach that will be acknowledged and put in practice in as many jurisdictions as possible.
109
00:17:16.920 --> 00:17:18.210
Cameron Ivey: Awesome, Bob.
110
00:17:18.930 --> 00:17:25.020
reckman: Yeah, that's a really interesting perspective because so I'm on the other side of the house, in some ways,
111
00:17:25.530 --> 00:17:31.080
reckman: Responsible for implementing the controls that keep data private, if you will. And certainly, Dr. I understand the
112
00:17:31.530 --> 00:17:42.120
reckman: trepidation with the words data and privacy together. It's almost like when you say data, privacy should just come along for the ride. We shouldn't have to qualify it by saying data should remain private, right?
113
00:17:42.690 --> 00:17:47.430
reckman: At the same time, though, I would say that without protection. I really can't have privacy.
114
00:17:48.720 --> 00:17:56.520
reckman: Even though you know in my space. We have to implement controls that ensure the confidentiality and the integrity of that data as well as the availability of that data.
115
00:17:57.090 --> 00:18:02.460
reckman: And to do that we need to appropriately assign access controls and authorizations and various things
116
00:18:02.790 --> 00:18:10.290
reckman: That do that. If we were to follow regulation purely, if we were to look at our various privacy-related programs such as HIPAA,
117
00:18:10.800 --> 00:18:18.120
reckman: PCI, GDPR, and we were to follow those to the letter, you know, I can implement a control that would meet a requirement in almost every one of those
118
00:18:18.390 --> 00:18:25.650
reckman: Different compliance measures. But that doesn't necessarily mean that I have a good security program; that also doesn't mean that my data is secure.
119
00:18:26.010 --> 00:18:35.010
reckman: Right. So when we talk about data privacy, to me, what that means is we've implemented not just the baseline level of security, but we've developed a defense-in-depth model
120
00:18:35.280 --> 00:18:41.490
reckman: That's really more risk-based and context-based, where we understand the risk that that data has to our organization,
121
00:18:41.820 --> 00:18:50.070
reckman: Has to the individual, right, and what it means to them and to us. And as a result of that, we've implemented the appropriate level of security to meet that requirement.
122
00:18:50.520 --> 00:19:04.170
reckman: When we begin to do that, what we find is that compliance measures are just kind of the run-of-the-mill or baseline approach to security, and that building on top of that is really the protection that provides that additional, you know, protection from the bad guys. I mean, let's be honest:
123
00:19:05.370 --> 00:19:12.270
reckman: Baseline security has been proven time and time again not to be enough. We've seen a number of organizations that have implemented
124
00:19:12.600 --> 00:19:21.270
reckman: Millions of dollars' worth of security improvements and they still fail. Why is that? Well, that's because they don't have control of their data; they don't know where it is, they don't know how it got there.
125
00:19:21.780 --> 00:19:27.240
reckman: Data is being moved out of a zero-trust environment to an endpoint somewhere, and then an adversary attacks the endpoint.
126
00:19:27.750 --> 00:19:31.590
reckman: So in my mind when I say the words data privacy, although I agree with the doctor that
127
00:19:31.950 --> 00:19:39.360
reckman: Quite frankly, data privacy in and of itself should mean security, it should mean compliance, it should mean protection, defense in depth.
128
00:19:39.690 --> 00:19:49.560
reckman: In my mind, it should encompass all those things. And when we, when we interact with our data we need to understand where that data resides. Why it resides there and who has access to it.
129
00:19:50.160 --> 00:19:54.420
reckman: In my business. It's 100% about authorized versus unauthorized access
130
00:19:55.290 --> 00:20:01.020
reckman: I can provide any level of protection I want to, but if I allow an unauthorized person to gain access to the data,
131
00:20:01.440 --> 00:20:15.720
reckman: I'll be in violation of the GDPR rule, and others as well here stateside, and that quite frankly could have been avoided had I just done a simple risk assessment, looked at a content-based approach to that data, and understood who was moving it, where it was going, and how it got there.
132
00:20:16.110 --> 00:20:17.520
reckman: So in my mind I
133
00:20:17.610 --> 00:20:31.920
reckman: Don't want to say they're synonymous, because they're not. I certainly understand the distinction, but privacy to me and protection really can't exist without each other; you really need both. Without privacy I don't have protection, and really, without protection I won't have privacy.
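To make Bob's authorized-versus-unauthorized point concrete, here is a minimal Python sketch of a role-based access check on a record. The roles, sensitivity labels, and record fields are hypothetical illustrations, not Kent State's actual model; a real program would derive authorizations from a documented risk assessment.

```python
# Illustrative only: a toy authorized-vs-unauthorized access check.
from dataclasses import dataclass

@dataclass
class Record:
    owner_id: str       # the data subject the record is about
    sensitivity: str    # e.g. "public", "internal", "restricted"
    payload: dict

# Hypothetical role-to-sensitivity mapping.
ALLOWED = {
    "registrar": {"public", "internal", "restricted"},
    "advisor": {"public", "internal"},
    "student_worker": {"public"},
}

def read_record(user_id: str, role: str, record: Record) -> dict:
    """Return the record only if the role is authorized for its sensitivity level."""
    if record.sensitivity not in ALLOWED.get(role, set()):
        # Unauthorized access is refused and logged, not silently served.
        print(f"DENIED: {user_id} ({role}) tried to read a {record.sensitivity} record")
        raise PermissionError("not authorized for this sensitivity level")
    print(f"GRANTED: {user_id} ({role}) read a {record.sensitivity} record")
    return record.payload

if __name__ == "__main__":
    rec = Record(owner_id="student-42", sensitivity="restricted", payload={"gpa": 3.7})
    read_record("u1001", "registrar", rec)        # authorized
    try:
        read_record("u2002", "advisor", rec)      # unauthorized, raises
    except PermissionError:
        pass
```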
134
00:20:34.110 --> 00:20:45.990
Cameron Ivey: That's really good. Thank you for for going into that. So, Gabriella and this obviously relates to you, Bob. So in the higher ed realm in the United States.
135
00:20:47.130 --> 00:21:00.570
Cameron Ivey: Who does this get assigned to? How do you know how to pick the right person in the organization? Is this already something that they should know they should be a part of, or is this something that needs to be kind of structured for each university?
136
00:21:03.660 --> 00:21:04.950
reckman: I'd be happy to take that one first.
137
00:21:04.950 --> 00:21:05.940
Gabriela: Yeah please say
138
00:21:07.800 --> 00:21:14.610
reckman: Yeah, so we, I would say, for the most part this falls on information security teams that I've seen. But, however, I will say,
139
00:21:14.940 --> 00:21:21.540
reckman: Some schools have assigned, as I mentioned earlier, like a Chief Privacy Officer role within the organization.
140
00:21:22.230 --> 00:21:26.430
reckman: That can be someone from the General Counsel's Office, their legal organization.
141
00:21:27.240 --> 00:21:36.240
reckman: I know the GDPR program has the data protection officer, which has to be, you know, at a very high level in the organization, has to report to the highest level of the organization,
142
00:21:36.570 --> 00:21:45.780
reckman: And acts as really the liaison between the organization and the GDPR should there be any issues or challenges, and that individual is responsible for auditing the program as well.
143
00:21:46.530 --> 00:21:50.550
reckman: The data privacy officer role slightly different. That's really more the operational side of it.
144
00:21:50.940 --> 00:21:59.730
reckman: And I've seen that fulfilled by many different roles within an organization. Oftentimes the CIO will take that on in some of the smaller schools. Some of the larger schools, you have a dedicated person,
145
00:22:00.300 --> 00:22:14.550
reckman: You know, with that title on their door. In most cases it's, you know, someone who is the CISO, quite frankly, but in parentheses it could say, you know, chief risk officer, chief privacy officer, or chief whatever officer after that. In many cases we have to kind of play that role.
146
00:22:15.510 --> 00:22:22.710
Cameron Ivey: Is that typically because an organization is too small and they don't have the the means to actually bring on a chief privacy officer.
147
00:22:23.040 --> 00:22:23.550
Cameron Ivey: Yeah, so
148
00:22:23.580 --> 00:22:36.660
reckman: You know, complying with regulation is becoming an expensive game. And yeah, I would agree 100%. I think some schools have taken that approach; other schools have really just, quite frankly, recognized privacy as part of the security
149
00:22:36.720 --> 00:22:37.110
Cameron Ivey: Role
150
00:22:37.500 --> 00:22:44.040
reckman: It's really not separate from it, it's really part of it. So I could see certainly the relationship there is key.
151
00:22:44.640 --> 00:22:53.280
reckman: But in our case, you know, we look at privacy, security, and compliance all together. We do have a chief compliance officer who drives that for us, which we're very grateful for.
152
00:22:53.670 --> 00:22:59.220
reckman: Another phenomenon that I think is changing the game, and I certainly would welcome Dr. Fortuna's thoughts on this:
153
00:22:59.670 --> 00:23:05.820
reckman: But we are we are also implementing a data governance program. And I think many schools are heading in that direction.
154
00:23:06.120 --> 00:23:12.510
reckman: Where they have a chief data officer who could also fulfill the role as a chief privacy or data protection type role as well.
155
00:23:13.200 --> 00:23:22.380
reckman: For us, it's a matter of identifying and classifying all of our data, you know, and then, again, a content-based, risk-based approach to how we treat it.
156
00:23:24.690 --> 00:23:44.550
Cameron Ivey: So I want to go to identifying data flows. So you mentioned before that GDPR applies to personal data, which is kind of broad. Can we kind of touch on, for the listeners, what that actually means? What kind of data identifies with GDPR data flows?
157
00:23:45.660 --> 00:23:56.580
Gabriela: So, I, I'm happy to take that one. Um, and perhaps this this will be a good segue to also weigh in, into the data officer, you know, type, type of
158
00:23:57.600 --> 00:24:08.910
Gabriela: job titles, we discussed earlier about, so the GDPR indeed applies to personal data and this concept is defined extremely broad
159
00:24:09.570 --> 00:24:23.910
Gabriela: In the GDPR, because the legal definition says that personal data is any information that relates to an identified or an identifiable natural person, or individual.
160
00:24:25.140 --> 00:24:39.000
Gabriela: So technically, it can be any information. So if you have a shoe size, you know, a number, that can be personal data, as long as it relates to
161
00:24:39.420 --> 00:24:54.150
Gabriela: An individual that is not even identified, but that can be identified. So for example, you have this piece of information: US six and a half is the shoe size number of
162
00:24:54.930 --> 00:25:00.750
Gabriela: The woman in this group of four people that are having a conversation now.
163
00:25:01.260 --> 00:25:09.120
Gabriela: And you know, you don't know that it's Gabriela, because it doesn't say that the shoe size number is, you know, Gabriela's.
164
00:25:09.450 --> 00:25:26.040
Gabriela: But it's kind of easy to identify me, because in this group of people I'm the only woman. And the moment you can single out someone and then relate some piece of information to that person that can be singled out,
165
00:25:27.150 --> 00:25:34.620
Gabriela: That information becomes personal data. So this is just to give you an idea of how broadly the regulation
166
00:25:35.220 --> 00:25:56.550
Gabriela: Defines personal data. Now, that was one extreme example, but then you have other information such as a unique IP address, even a dynamic IP address can become personal data. If you have other pieces of information that can be put together with it and we already have.
167
00:25:57.780 --> 00:26:00.870
Gabriela: Cases from the Court of Justice of the European Union that
168
00:26:02.190 --> 00:26:13.620
Gabriela: confirmed this as a true statement. Then you can also have inferences about this particular individual;
169
00:26:14.520 --> 00:26:31.950
Gabriela: That's personal data as well. And of course you have the type of information that you immediately think is personal data, like, you know, a social security number or similar types of unique numbers
170
00:26:33.210 --> 00:26:50.940
Gabriela: That are assigned to a person, or like a photograph or an image of someone, that's personal data. And then the regulation also covers biometric data, all sorts of biometric data, and if that biometric data is used to uniquely identify a person,
171
00:26:52.410 --> 00:27:02.130
Gabriela: Then it becomes sensitive data, or a special category of data, together with data relating to health, religion, race,
172
00:27:03.390 --> 00:27:22.170
Gabriela: philosophical beliefs. That's actually categorized as special, sensitive data under the GDPR as well. So as you can see, it really covers a lot of ground, and this was one of the main challenges when the GDPR
173
00:27:23.280 --> 00:27:30.480
Gabriela: Started to bring attention to data protection law here in the US, because here.
174
00:27:31.530 --> 00:27:44.700
Gabriela: The privacy laws, mostly cover some well defined pieces of information. So, for example, under HIPAA you have personal like protected health information and this is clearly defined in a specific way.
175
00:27:45.930 --> 00:27:58.170
Gabriela: You usually have some sort of PII personally identifiable information type of category of data that's protected under your privacy law and that usually is defined in a way
176
00:27:58.980 --> 00:28:19.800
Gabriela: Which links your PII to a specific identity, and you know you have very clear criteria to follow. So this was a huge challenge in between 2016 and 2018. When the GDPR entered into force in 2016, after it was adopted,
177
00:28:20.310 --> 00:28:22.680
Gabriela: Within that two years period of time.
178
00:28:23.190 --> 00:28:33.990
Gabriela: When entities had to prepare for becoming compliant a huge challenge was to actually identify what data is personal, you know,
179
00:28:35.130 --> 00:28:50.430
Gabriela: From what they had, because, for good reason, the data here, the PII here, you know, personally identifiable information here, was defined in a different way than the very broad way the GDPR defines it.
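A minimal sketch of how broad that definition is in practice: a toy triage of field names under a GDPR-style lens. The field names and buckets below are hypothetical, and real classification needs legal review, since identifiability depends on context (the shoe-size example), not just on what a column is called.

```python
# Illustrative only: rough triage of fields against the broad GDPR notion of personal data.
DIRECT_IDENTIFIERS = {"name", "email", "national_id", "photo"}
ONLINE_IDENTIFIERS = {"ip_address", "mac_address", "device_id", "cookie_id"}
SPECIAL_CATEGORIES = {"health", "religion", "race", "biometric_template", "political_opinion"}

def classify_field(field_name: str) -> str:
    """Return a rough GDPR-style label for a single field name."""
    if field_name in SPECIAL_CATEGORIES:
        return "special category (Art. 9) - strongest safeguards"
    if field_name in DIRECT_IDENTIFIERS or field_name in ONLINE_IDENTIFIERS:
        return "personal data - identifies or can single out a person"
    # Anything else may still be personal data if it can be linked to someone,
    # so it is flagged for review rather than cleared.
    return "review: personal data if linkable to an identifiable person"

if __name__ == "__main__":
    for field in ["email", "ip_address", "health", "shoe_size"]:
        print(f"{field:15s} -> {classify_field(field)}")
```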
180
00:28:51.990 --> 00:29:00.630
Gabe Gumbs: So one of the challenges is a lot of that data was collected for a purpose to begin with and GDPR certainly talks a lot about purpose.
181
00:29:00.990 --> 00:29:09.030
Gabe Gumbs: Now in your report, this guide for higher education universities, you lay out these 10 practical steps to begin a GDPR compliance program.
182
00:29:09.300 --> 00:29:15.150
Gabe Gumbs: And one of those in particular is establishing a retention schedule for personal data that is subject to GDPR.
183
00:29:15.750 --> 00:29:21.750
Gabe Gumbs: So this question is kind of for both yourself Gabriella and especially you Bob, which is
184
00:29:22.230 --> 00:29:37.140
Gabe Gumbs: I understand the necessity to have such a retention schedule in place, and we've had data retention schedules in place for quite some time now. But how has the application of GDPR to higher education institutions
185
00:29:38.250 --> 00:29:48.240
Gabe Gumbs: forced you to update, and force might be the wrong word, but how has it affected how you've updated those retention schedules? Or maybe they haven't been that much affected at all.
186
00:29:49.440 --> 00:29:56.070
reckman: Well, from, from our perspective, the retention schedule itself has not been impacted a whole lot. We had a pretty conservative approach to retention.
187
00:29:56.520 --> 00:30:05.400
reckman: So it wasn't a huge stretch for us. Now, to carry on this point, she mentioned the indicators that GDPR identifies as personal,
188
00:30:05.670 --> 00:30:13.290
reckman: That include things like IP address and MAC address and all kinds of really interesting notes that begin to pull in things like network logs.
189
00:30:13.560 --> 00:30:21.030
reckman: You know, that was never considered personally identifiable information before. So I think in many ways, schools are still struggling with how to treat this data.
190
00:30:21.780 --> 00:30:28.470
reckman: How to interact with this data, especially those what I'll call fringe personally identifiable information indicators.
191
00:30:28.770 --> 00:30:37.830
reckman: You know, you look at HIPAA; very clearly there's 18 identifiers in HIPAA that are very clear, that are healthcare oriented, right? But I think something we can't lose sight of,
192
00:30:38.550 --> 00:30:43.770
reckman: And that is here in the United States we look at data privacy very differently than they do in Europe.
193
00:30:44.700 --> 00:30:53.010
reckman: Here in the United States, if I self-report that data, Gabe, if I give you that data, at that point I'm giving away the rights to that data. I mean, quite frankly,
194
00:30:53.700 --> 00:31:00.600
reckman: If I'm doing that I'm self reporting it to you. I don't have to necessarily get a business associate agreement to tell you I broke my leg last week, right.
195
00:31:00.960 --> 00:31:04.800
reckman: So it's a very different approach, whereas in the GDPR, in their program,
196
00:31:05.130 --> 00:31:16.350
reckman: You know, in the European program, definitely, no matter how that data arrived at you, you have to protect that data under the GDPR rule; it doesn't belong to you, necessarily. So it's a really interesting way of looking at it
197
00:31:16.680 --> 00:31:22.860
reckman: Between the United States and Europe. And I think what GDPR has done is it's kind of flipped the bit here in the United States quite a bit. It's got us
198
00:31:23.100 --> 00:31:27.030
reckman: Really thinking more conscientiously about how we interact with the data that we use.
199
00:31:27.300 --> 00:31:34.080
reckman: How, you know, organizations might want to use that data for marketing, but it might not be appropriate to use that data for marketing, let alone be legal
200
00:31:34.410 --> 00:31:39.300
reckman: According to GDPR. So I think, beyond just retention, I think we've really begun to look at our programs,
201
00:31:39.600 --> 00:31:46.050
reckman: The technologies we use, especially those technologies that tell us a lot about what's happening with the user from a security perspective.
202
00:31:46.440 --> 00:31:56.280
reckman: We have to be very careful about how we implement these tools, who has access to them, how we treat that data. I think GDPR has really introduced that to us, more so than any other privacy law to date.
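One common way teams reduce the exposure of the network logs Bob describes is to pseudonymize IP addresses before the logs are retained or shared. The sketch below is illustrative only; the regex and salting scheme are simplified assumptions, not a vetted design, and pseudonymized data generally remains personal data under the GDPR, just with lower risk.

```python
# Illustrative only: pseudonymize IPv4 addresses in log lines with a salted hash
# so entries stay correlatable for security work while being less identifying.
import hashlib
import re

IPV4 = re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b")
SALT = b"rotate-me-regularly"  # hypothetical; manage and rotate secrets properly in practice

def pseudonymize_ip(ip: str) -> str:
    """Replace an IP with a stable salted hash token."""
    digest = hashlib.sha256(SALT + ip.encode()).hexdigest()[:12]
    return f"ip-{digest}"

def scrub_line(line: str) -> str:
    """Replace every IPv4 address found in a log line."""
    return IPV4.sub(lambda m: pseudonymize_ip(m.group(0)), line)

if __name__ == "__main__":
    print(scrub_line("2020-06-01 12:00:01 login ok user=jdoe src=203.0.113.42"))
```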
203
00:31:57.450 --> 00:32:00.750
Gabe Gumbs: Interesting, interesting. Gabriela, anything on that topic?
204
00:32:01.680 --> 00:32:08.760
Gabriela: Um, no, not really. And that was very, very useful for me to hear, Bob, actually. So thank you for
205
00:32:09.930 --> 00:32:18.810
Gabriela: Specifying that. I would, indeed, if I were to add something, just add that data retention
206
00:32:19.830 --> 00:32:37.320
Gabriela: Is indeed linked to the purpose of why you're collecting the data. Um, so, you know, at the beginning, when I was talking to people that had to put in place retention schedules, it was very difficult to give
207
00:32:38.070 --> 00:32:50.460
Gabriela: A specific answer to the question, "So, how long should we keep it?" Because the GDPR says that you can keep the data as long as you need it to accomplish the purpose for which you have it.
208
00:32:51.630 --> 00:33:10.110
Gabriela: So then this was, you know, another challenge, to identify that purpose. Um, but I will have to say this: indeed, those organizations that already had a solid retention schedule, just like Bob's organization, had much, much less difficulty,
209
00:33:11.340 --> 00:33:16.830
Gabriela: You know, to, to sort of clean up the data governance environment.
210
00:33:18.120 --> 00:33:18.840
Gabriela: So,
211
00:33:20.250 --> 00:33:24.030
Gabriela: Yeah, data retention is definitely one of the
212
00:33:25.320 --> 00:33:27.180
Gabriela: key rules in the GDPR.
213
00:33:28.200 --> 00:33:32.490
Gabriela: It's one of the key principles, under Article five
214
00:33:33.690 --> 00:33:43.680
Gabriela: And it does not have a black-and-white answer. It really depends on the purpose for why you collected the data, and it depends also on the legal obligations you have, because you might,
215
00:33:44.070 --> 00:33:56.190
Gabriela: You know, you might have some obligations that come from some sort of audit law, or all sorts of retention obligations that come from local laws.
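A minimal sketch of the point Gabriela makes here, that retention is keyed to purpose rather than to a single fixed number: a retention schedule expressed as a purpose-to-period map plus a check for records past their limit. The purposes and periods are hypothetical examples, not recommendations; real schedules also have to fold in local audit and records laws.

```python
# Illustrative only: purpose-based retention schedule and an expiry check.
from datetime import date, timedelta
from typing import Optional

RETENTION_BY_PURPOSE = {
    "admissions_application": timedelta(days=365 * 2),   # hypothetical: 2 years
    "enrolled_student_record": timedelta(days=365 * 7),  # hypothetical: 7 years
    "marketing_inquiry": timedelta(days=180),            # hypothetical: 6 months
}

def is_past_retention(purpose: str, collected_on: date, today: Optional[date] = None) -> bool:
    """True if data collected for this purpose should now be deleted or anonymized."""
    today = today or date.today()
    return today > collected_on + RETENTION_BY_PURPOSE[purpose]

if __name__ == "__main__":
    print(is_past_retention("marketing_inquiry", date(2019, 1, 15), today=date(2020, 6, 1)))       # True
    print(is_past_retention("enrolled_student_record", date(2019, 1, 15), today=date(2020, 6, 1)))  # False
```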
216
00:33:56.730 --> 00:34:04.860
Gabe Gumbs: And so, switching gears a little bit, this question actually came in from another listener who's in the higher education space.
217
00:34:05.370 --> 00:34:14.610
Gabe Gumbs: And this university in particular is a state agency, Bob, much like yours is, only this state agency is covered by sovereign immunity, which,
218
00:34:15.000 --> 00:34:28.770
Gabe Gumbs: Bob, I don't know if yours is, and I'm not actually sure of the distinction between a state agency being covered by sovereign immunity versus not; maybe we can articulate that for some listeners also. But their question is basically, as an entity that's covered by sovereign immunity,
219
00:34:30.030 --> 00:34:30.810
Gabe Gumbs: does GDPR apply?
220
00:34:32.070 --> 00:34:36.450
reckman: Well, okay. So let me first say that I'm not a lawyer, and I don't play one on TV.
221
00:34:37.530 --> 00:34:47.460
reckman: All right, we have a very talented GC staff; they could address that question much better than I can. So I'll defer that to them, if you don't mind. But I will say, I will say this.
222
00:34:47.910 --> 00:34:48.630
reckman: I will say that
223
00:34:48.930 --> 00:34:57.630
reckman: I think most colleges ours included, you know, we're doing everything we reasonably can to implement what I'll call industry standards relative to data privacy.
224
00:34:58.110 --> 00:35:11.160
reckman: And GDPR is a program that we recognize, and I don't want to say not from a legal perspective, but I recognize it from a security perspective as just being good orderly direction, if nothing else:
225
00:35:12.240 --> 00:35:20.400
reckman: Healthy security practices for data, ways that we should really be viewing privacy of data anyway. That's just my own personal take
226
00:35:20.790 --> 00:35:25.680
reckman: On the GDPR program. When I first read it, though, the ambiguity jumped out at me immediately.
227
00:35:26.340 --> 00:35:32.760
reckman: Like, show me in the regulation who has to be trained on an annual basis. It doesn't speak to that. It just says you train people.
228
00:35:33.300 --> 00:35:39.330
reckman: Okay that's good that's helpful. Right. There's a lot of interpretation there what typically happens with regulation.
229
00:35:40.170 --> 00:35:52.110
reckman: And again, typically, I'm not going to say this is the case with GDPR, is that the courts and fines and regulatory findings tend to drive the regulation. And what do I mean by that? They're normally adopted by,
230
00:35:52.920 --> 00:36:00.720
reckman: You know, political and or legal groups who really have not worked in the trenches of security. So they really don't understand what data security is
231
00:36:01.590 --> 00:36:06.000
reckman: So they draft these laws and these approaches that state you're going to do certain things.
232
00:36:06.420 --> 00:36:10.230
reckman: And then you get them and you begin to implement them kind of hoping you're doing the right thing.
233
00:36:10.500 --> 00:36:16.920
reckman: Well then, all of a sudden, there's a compliance review done, right, and an organization gets fined, and you read that compliance review and go, oh my gosh,
234
00:36:17.100 --> 00:36:22.470
reckman: We shouldn't have been doing it that way. We should have been doing it this way instead. And you correct your program and what happens over time is
235
00:36:22.830 --> 00:36:31.800
reckman: More and more correction, and finally you get to a point of understanding. And I think we're in that growth phase with GDPR now; we're all still kind of learning from it.
236
00:36:32.280 --> 00:36:40.290
reckman: But I think we're getting a better picture of what the expectations are, and the more we see of it, I think, especially universities, be they state or sovereign or not,
237
00:36:40.620 --> 00:36:44.430
reckman: Really should be implementing these controls, just as part of good healthy direction.
238
00:36:45.420 --> 00:36:50.280
reckman: I've heard some universities say they're not going to implement them, and they cite sovereign immunity as an excuse.
239
00:36:50.700 --> 00:36:55.410
reckman: Others say they're not going to do it because it's not applicable: I'm not in the European Union, so it's not applicable to me.
240
00:36:55.830 --> 00:37:05.880
reckman: To which I ask them, do you recruit from the European Union? Because if you do, it might be. So I think what this law has forced us to do is take a more critical look at our programs, not from a
241
00:37:06.510 --> 00:37:11.700
reckman: Purely, you know, institutional perspective of protecting our data and intellectual property, which is important to us,
242
00:37:12.390 --> 00:37:17.070
reckman: But also looking at it from the individuals perspective as well. And what that data means to them as a person.
243
00:37:17.460 --> 00:37:25.200
reckman: And that they are entrusting us with this data. And these are probably really good practices for all of us to implement, whether it be
244
00:37:25.980 --> 00:37:38.640
reckman: My university or others, right? So yes, I would answer that with really that simple statement: I think universities are doing everything they reasonably can to implement this law. And I think that the more we learn, the better we'll get.
245
00:37:40.050 --> 00:37:48.840
Gabe Gumbs: Indeed, thank you. Gabriela, I know you are also not a lawyer; however, you've done quite a bit of study on this. But, yes or no, sovereign immunity?
246
00:37:49.440 --> 00:38:00.240
Gabe Gumbs: But Bob, I think, is in the same camp that I certainly am, and many of us are, which is: if you collect that information, I don't see how you could be exempt. But what's your take on it?
247
00:38:00.930 --> 00:38:13.230
Gabriela: Oh, I am technically a lawyer, but back in Europe, not the US. So I cannot claim that here, because I don't have a JD degree, as you should have here, but I do have the equivalent of it
248
00:38:14.250 --> 00:38:17.160
Gabriela: Back in Romania and the EU, you might
249
00:38:17.790 --> 00:38:20.760
Gabe Gumbs: Not know that. That's also excellent.
250
00:38:21.540 --> 00:38:24.510
Gabriela: And I would say that
251
00:38:25.560 --> 00:38:47.520
Gabriela: You know, the extraterritorial clause in the GDPR, which is Article 3, paragraph 2, does not make any sort of distinction for public agencies of other countries. Now, of course, we also have to take into account general international law.
252
00:38:48.780 --> 00:39:03.780
Gabriela: And for example, there's an entire debate on whether the GDPR would apply to an international organization, because we know that international organizations, these supranational types of bodies, have something called privileges and immunities.
253
00:39:05.130 --> 00:39:27.210
Gabriela: So they are not subject to the law of any given state. However, that's not the case, generally, of a public agency of another state, right? So the GDPR does not make that distinction. Um, it simply refers to controllers or processors that offer goods or services to people in the EU.
254
00:39:28.710 --> 00:39:49.200
Gabriela: And we don't have any sort of case law or practice that indicates in any way, shape, or form that the GDPR would not be applicable to a public university that's outside of the European Union. Now, of course, I would have to look into the details of what this sovereign
255
00:39:49.800 --> 00:39:52.950
Gabe Gumbs: Specifically, sovereign immunity here is the
256
00:39:53.490 --> 00:39:55.110
Gabe Gumbs: American way to say, "not me."
257
00:39:55.650 --> 00:39:56.310
Yes.
258
00:39:57.750 --> 00:40:23.310
Gabriela: Yes, indeed. So I would say that there is nothing in European Union law or in international law that would keep an individual from submitting a complaint to a data protection authority in the EU if they, for example, cannot obtain a copy of their own application file after
259
00:40:23.310 --> 00:40:24.990
Gabe Gumbs: After submitting the application file to
260
00:40:25.290 --> 00:40:35.160
Gabriela: A state university here. Now, you know, if we go to the question of how do we enforce the GDPR against organizations in third countries,
261
00:40:35.250 --> 00:40:37.830
Gabriela: Generally, not necessarily state bodies,
262
00:40:37.860 --> 00:40:40.740
Gabriela: Now that that's that's an entire other conversation.
263
00:40:41.880 --> 00:40:49.260
Gabe Gumbs: All right. Well, great, great. So I've got another one, then I'll stop hogging the mic here and let my partner get in there. So, Bob,
264
00:40:49.290 --> 00:40:50.040
Cameron Ivey: Yeah, come on.
265
00:40:51.810 --> 00:40:56.610
Gabe Gumbs: As you've had to extremely quickly shift to some online learning.
266
00:40:58.200 --> 00:41:11.910
Gabe Gumbs: Positions and tools and technologies and just implement that into the university, and I imagine going into the next year you're gonna have some healthy mix of more students that are going to be distance learning than before. How's that affecting your privacy program?
267
00:41:13.410 --> 00:41:23.850
reckman: Well, you know, from a student perspective, it's not holistically impacting us from a privacy perspective. And again, I think you would find these issues at any university.
268
00:41:24.660 --> 00:41:35.040
reckman: The bigger issue is on the administrative side of the house. It's people that were working in an office now working from home consistently connecting to strange networks with strange machines and coming in through, you know,
269
00:41:36.270 --> 00:41:42.180
reckman: You know, if you think in terms of the HIPAA law, you have to control access to a machine that displays PHI.
270
00:41:42.750 --> 00:41:46.020
reckman: And so what that means is only authorized individuals can see that screen.
271
00:41:46.470 --> 00:41:56.910
reckman: And so in an office environment we point screens away from public areas we put screen protectors on them. We do all kinds of fun stuff to keep that data from being viewable by an unauthorized individual
272
00:41:57.630 --> 00:42:06.240
reckman: Very similar to when you go to the pharmacy. They make you stand back a certain distance. So you can't interact with supposedly, you're not supposed to be able to hear them but
273
00:42:06.750 --> 00:42:14.880
reckman: You certainly can. But you get the idea. So that becomes a challenge in the home environment: is that machine that that individual is interacting with PHI on sitting on a
274
00:42:15.180 --> 00:42:23.970
reckman: Living room table in front of the family and etc etc. So I think for us and for most universities, it's really just a good hard look at good security practices.
275
00:42:24.450 --> 00:42:31.380
reckman: And we did a tremendous job of putting in place a "Keep on Working" website. We have a "Keep on Working" at Kent State, a "Keep on Working,"
276
00:42:31.890 --> 00:42:38.970
reckman: "Keep on Teaching," and "Keep on Learning," which all spoke to some semblance of security elements that each one of those areas should consider from an
277
00:42:39.270 --> 00:42:47.010
reckman: Admin faculty and student perspective and certainly included in that were some privacy elements that we felt were important for people to keep in mind as they work from home.
278
00:42:47.370 --> 00:42:52.410
reckman: Use of VPN went through the roof, as I'm sure you could imagine; virtual private networking became the norm.
279
00:42:52.890 --> 00:42:57.870
reckman: But also we're very fortunate to have a great relationship with Microsoft where we've implemented Microsoft authentication.
280
00:42:58.290 --> 00:43:07.980
reckman: Which really doesn't require our users to go through a VPN, necessarily; they can hit Microsoft directly and get out, work in Teams and all that good stuff. And it's put behind, you know, enhanced
281
00:43:08.880 --> 00:43:25.170
reckman: Authentication measures that Microsoft offers. So for us it really wasn't a huge technical adjustment; it was a tremendous amount of work that we had to do. But I think for most schools, they've struggled with the whole work-from-home security piece, and we're figuring it out as we go. So
282
00:43:25.260 --> 00:43:27.060
Gabe Gumbs: Awesome. Alright.
283
00:43:31.290 --> 00:43:31.590
Cameron Ivey: Me.
284
00:43:33.720 --> 00:43:49.620
Cameron Ivey: I just saw you. I'm just good. Let's turn the page here and get a little free. So I have some fun questions I want to ask you both. We'll start with Gabriela, and it will be the same question for you, Bob, as well. So, Gabriela, what is your guilty pleasure?
285
00:43:51.390 --> 00:43:55.560
Gabriela: Oh, goodness. I, I would have to say ice cream at midnight.
286
00:43:57.510 --> 00:43:58.620
Cameron Ivey: Don't we all have that
287
00:43:59.820 --> 00:44:03.420
Gabriela: That quite quite, you know quite a lot.
288
00:44:04.950 --> 00:44:07.890
Gabriela: In the past month. For some reason, I wonder why.
289
00:44:09.600 --> 00:44:13.710
Gabriela: Yeah yeah it's at least it's not something worse. I mean, it could be true.
290
00:44:14.760 --> 00:44:15.420
Cameron Ivey: But what about you.
291
00:44:16.050 --> 00:44:22.890
reckman: Uh, it's honestly grazing, you know. Now that I'm at home I could just, between calls, just make my way down to the kitchen and graze
292
00:44:24.060 --> 00:44:29.010
reckman: From my office. And man, I'm telling you, I am eating more than I ever have.
293
00:44:31.620 --> 00:44:36.150
reckman: I said to my coworkers, we sometimes wear suits at work, and I said, I don't know that I'll be able to fit into any of my
294
00:44:37.830 --> 00:44:40.500
Cameron Ivey: Might just break a few buttons.
295
00:44:40.770 --> 00:44:43.950
Cameron Ivey: Be good. Be good. So
296
00:44:44.010 --> 00:44:48.210
Cameron Ivey: I know that you, Bob, you have a dog, right? And then Gabriela, do you have any pets?
297
00:44:49.080 --> 00:44:50.610
Gabriela: No, I do not have any.
298
00:44:51.150 --> 00:45:02.220
Cameron Ivey: Okay, so I'm going to have a different question for you. But this one's for you, Bob. If you could ask your pet three questions... oh, I think you went to go get him. If you could ask your pet three questions, what would they be?
299
00:45:03.180 --> 00:45:08.940
reckman: Oh, what would they be? Um, well, I wish he was here for me to show him to you, but
300
00:45:10.020 --> 00:45:15.180
reckman: He's a terrier. His name is Toby, and I swear he's slightly evil, and
301
00:45:16.500 --> 00:45:25.380
reckman: A great little dog but he's got the most amazing personality and he has taught himself how to open and close our sliding door downstairs. So
302
00:45:26.160 --> 00:45:30.480
reckman: Yeah, I would probably ask him why is it he feels that
303
00:45:31.710 --> 00:45:37.710
reckman: That a literal army is standing at my door every time someone comes by, and he loses his mind.
304
00:45:37.800 --> 00:45:38.580
Every single time.
305
00:45:39.780 --> 00:45:44.310
reckman: Second question would be: why do you beg for food so much? We feed you every day; there's food in your bowl
306
00:45:44.370 --> 00:45:45.060
All the time.
307
00:45:46.140 --> 00:45:56.460
reckman: So much. And the third question is: why do you growl at the cats when there's really no reason to do that? You're perfectly comfortable on the bed, the cat walks in the room, and he starts growling. Those would be my questions.
308
00:45:57.690 --> 00:46:00.240
Gabe Gumbs: The last one is self-evident: it's a cat. Yes.
309
00:46:00.300 --> 00:46:00.660
This one.
310
00:46:02.310 --> 00:46:03.540
Cameron Ivey: They have some kind of
311
00:46:04.620 --> 00:46:06.840
Cameron Ivey: determination to piss off dogs.
312
00:46:07.710 --> 00:46:08.460
Cameron Ivey: Some way or another.
313
00:46:09.750 --> 00:46:13.770
Cameron Ivey: So this one can be for both of you. If you could be a superhero,
314
00:46:14.130 --> 00:46:15.540
Cameron Ivey: Who would it be and why
315
00:46:17.100 --> 00:46:29.670
Gabriela: Oh goodness that's. This is so funny because I actually have a cartoon over here on my wall that I received from my colleagues in Brussels and it depicts me as superwoman flying over the ocean.
316
00:46:30.540 --> 00:46:31.020
Gabriela: You know,
317
00:46:32.010 --> 00:46:41.250
Gabriela: With the GDPR and data protection here, so it's, you know, some sort of Wonder Woman, I suppose. That'd be my ideal
318
00:46:42.330 --> 00:46:43.380
Gabriela: Superhero
319
00:46:44.790 --> 00:46:45.180
Gabriela: Yeah.
320
00:46:45.810 --> 00:46:47.940
Cameron Ivey: Awesome Wonder Woman of GDPR
321
00:46:48.030 --> 00:46:49.920
Gabriela: Yeah, Wonder Woman of GDPR, yes.
322
00:46:50.760 --> 00:46:52.380
Gabe Gumbs: Is that Toby? Is that him now?
323
00:46:52.770 --> 00:46:54.420
reckman: That's my man right there. That's Toby.
324
00:46:55.710 --> 00:46:56.250
Hey, Toby.
325
00:46:58.590 --> 00:46:59.010
Cameron Ivey: Is he
326
00:46:59.160 --> 00:47:02.610
reckman: Does he feel like he's standing at the door right now. He looks. He looks like he's ready to go.
327
00:47:02.790 --> 00:47:04.170
reckman: Yes, he wants to get down.
328
00:47:06.630 --> 00:47:10.560
reckman: Superhero traits, I would say no. No doubt about it. Iron Man.
329
00:47:11.070 --> 00:47:14.820
reckman: No doubt about it, one that would be the coolest thing man to have that suit.
330
00:47:16.380 --> 00:47:16.590
reckman: Yeah.
331
00:47:16.620 --> 00:47:18.810
Cameron Ivey: Tony Stark. I mean couldn't be any cooler than that.
332
00:47:19.110 --> 00:47:22.500
reckman: Yeah, I mean, to have that kind of money. Yeah, no doubt, and that brilliance. Oh.
333
00:47:22.920 --> 00:47:32.040
Cameron Ivey: Yeah, that too. So, as we wrap things up, anything that we didn't touch on that either of you would like to mention?
334
00:47:32.790 --> 00:47:42.630
Cameron Ivey: Do you like people following you on social media? Do you speak at events, whenever events start happening again? Is there anything that you want to add, any kind of value, thought leadership, before we wrap things up?
335
00:47:44.250 --> 00:47:53.190
reckman: I'll go very quickly and then turn it over to the good doctor to wrap us up here. I'll just say that this has been, first off, a really nice conversation, good podcast.
336
00:47:53.670 --> 00:48:02.850
reckman: Really got me thinking about privacy slightly differently. The whole conversation on privacy and protection was interesting to me, something I'm going to consider further. I think that, as
337
00:48:03.690 --> 00:48:08.160
reckman: As an entity, the United States really needs to come to come to grips with what we're doing for privacy.
338
00:48:08.880 --> 00:48:12.660
reckman: I know we have a number of programs. But that's part of the problem is we have a number of programs.
339
00:48:13.530 --> 00:48:21.180
reckman: I heard recently numbers like 20 or 30 states have legislation either passed or going to pass soon for privacy.
340
00:48:21.600 --> 00:48:25.740
reckman: I've heard the United States government is considering a privacy bill for the US privacy.
341
00:48:26.160 --> 00:48:37.320
reckman: All this sounds wonderful. But boy, if we're not careful, we're going to end up with 50 some different privacy laws that we're going to have to meet here in the United States. In addition to GDPR in addition to India, Canada, China.
342
00:48:37.380 --> 00:48:52.950
reckman: You name it, also. Thank you very much, Doctor. So yeah, in my opinion, my suggestion, if I could give any parting guidance to the tinfoil-hat-wearing security guys and girls out there, is I would definitely say build your program for the worst-case scenario.
343
00:48:54.000 --> 00:48:59.580
reckman: Plan for the worst, hope for the best. So if GDPR is the most restrictive, if you want to use that word, restrictive,
344
00:48:59.940 --> 00:49:15.240
reckman: Protective program that we have, then build your privacy program around that, because I have a sense that if you can meet that, you'll meet the majority of what's out and coming out soon. That'd just be my suggestion: build it with the future in mind. Doctor?
345
00:49:15.810 --> 00:49:21.120
Gabriela: I cannot agree with you more. Bob. This is absolutely great. And I also think that
346
00:49:22.260 --> 00:49:31.440
Gabriela: A federal comprehensive privacy law here in the US is needed, and it's about time we get it here.
347
00:49:32.160 --> 00:49:43.830
Gabriela: There have actually been legislative debates back in the '70s, you know, because I mentioned Europe, but there were also a lot of debates here in the US at that time, but unfortunately they did not
348
00:49:44.340 --> 00:49:58.770
Gabriela: lead towards a comprehensive privacy law, but just to a law that's much narrower and applies to federal agencies in some very limited way. Um, I would say that
349
00:49:59.340 --> 00:50:14.610
Gabriela: I really like the attention that personal data and privacy have right now in the world. Technically, as Bob was mentioning, there are many legislative
350
00:50:15.000 --> 00:50:26.250
Gabriela: Initiatives all over the world. India is going to have a comprehensive law; Brazil adopted a law a couple of years ago, and it's going to become applicable soon,
351
00:50:26.790 --> 00:50:37.710
Gabriela: Very similar to the GDPR. We have data protection and privacy laws all around the world, and I really like to see that the conversation has evolved so much,
352
00:50:38.970 --> 00:50:48.000
Gabriela: Including here in the US, we are having some discussions that are really, really great to see. And also, last point.
353
00:50:48.900 --> 00:51:01.140
Gabriela: I would really love if, as part of this conversation we're having in the US, we are thinking more about this differentiation between data protection and privacy.
354
00:51:01.590 --> 00:51:09.120
Gabriela: So, how do we protect personal data to achieve fairness towards people? You know, that can be a goal in itself.
355
00:51:09.960 --> 00:51:18.750
Gabriela: And is that equivalent to protecting the intimate private sphere of someone, or can we sort of think of them
356
00:51:19.500 --> 00:51:34.740
Gabriela: Differently, and value both of them at the same time? So I would like to see some of that conversation going on as well. And I'm always happy to interact with people on my Twitter and my LinkedIn, where I post quite frequently.
357
00:51:35.790 --> 00:51:37.260
Gabe Gumbs: And what is your Twitter handle
358
00:51:38.700 --> 00:51:40.470
Gabriela: That's gonna be a tough one, Gabe.
359
00:51:42.660 --> 00:51:43.350
Cameron Ivey: Even pronounce it.
360
00:51:43.950 --> 00:51:52.140
Gabriela: It's Gabriela Zanfir, and that's Gabriela with one L, and then Z-A-N-F-I-R.
361
00:51:53.670 --> 00:51:57.060
Gabe Gumbs: We'll make sure we post those post that in the notes as well too.
362
00:51:57.900 --> 00:52:00.270
Gabe Gumbs: How about yourself. You hang out on social media at all.
363
00:52:02.460 --> 00:52:03.060
Cameron Ivey: Oop, you're on mute.
364
00:52:07.410 --> 00:52:14.760
reckman: I'm on my social media as much as I can. So I do have a LinkedIn profile; people are welcome to connect with me there. I do a number of speaking engagements throughout the year, and
365
00:52:15.030 --> 00:52:21.660
reckman: Probably the best way to reach me, if someone wants to reach me, is security at kent dot edu. That's our catch-all, and people reach out to me
366
00:52:21.930 --> 00:52:29.040
reckman: From the general public there quite a bit. But otherwise, I would say LinkedIn is a great, great tool. Bob Eckman is my handle.
367
00:52:29.730 --> 00:52:32.010
Gabe Gumbs: Excellent. I appreciate having you both on today.
368
00:52:33.300 --> 00:52:34.380
Cameron Ivey: Thank you both so much.
369
00:52:35.010 --> 00:52:36.480
Cameron Ivey: Really, really appreciate your time.
370
00:52:37.200 --> 00:52:38.760
Cameron Ivey: And we'll see you next time.
371
00:52:39.150 --> 00:52:40.080
reckman: Take care. Bye now.