Forensic question: What information is recoverable from the use of Google Assistant when the device is not connected to a vehicle via Android Auto?
OS Version: Android 8.1 (Oreo)
Google Assistant v. 3.8.584564 - Installed 01/16/2019 13:15 (EST)
Google v8.91.5.21 - Installed 01/16/2019 13:10 (EST)
Maps v10.7.1 - Installed 01/16/2019 13:01 (EST)
Tools:
In part two of this article I will be looking at Google Assistant artifacts that are generated when using a device outside of the car (non-Android Auto). Since this post is a continuation of the first, I will dispense with the usual pleasantries, and jump right into things. If you have not read Part 1 of this post (dealing with the Google Assistant artifacts generated when using Google Assistant via Android Auto), at least read the last portion, which you can do here. The data (the phone extraction) discussed in both posts can be found here. Just know that this part will not be as long as the first, and will, eventually, compare the Google Assistant artifacts generated in Android Auto to those generated just using the device.
However, if you don’t feel like clicking over, let’s recap.
Google Assistant resides in the /data/data directory. The folder is com.google.android.googlequicksearchbox. See Figure 1.
Figure 1. Google Assistant’s home in Android.
This folder also holds data about searches that are done from the Quick Search Box that resides at the top of my home screen (in Oreo). The folder has the usual suspect folders along with several others. See Figure 2 for the folder listings.
Figure 2. Folder listing inside of the googlequicksearchbox folder
The folder of interest here is app_session. This folder holds a great deal of data, though nothing about its contents suggests that at first glance. The folder contains several binarypb files which, as I learned after doing additional research, are binary protocol buffer files. Protocol buffers are Google's home-grown binary serialization format, a compact, schema-driven alternative to XML and JSON. These files contain data that is relevant to how a user interacts with their device via Google Assistant. See Figure 3.
Figure 3. binarypb files
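Even without the message schema, protocol buffers carry their own wire-format framing, so the top-level structure of a blob can be walked with a few lines of code. Below is a minimal, schema-less wire-format reader (similar in spirit to `protoc --decode_raw`). It is a sketch that assumes a single well-formed message; real binarypb session files may contain concatenated or non-conforming data that trips a naive reader like this one.

```python
def read_varint(buf, i):
    """Decode a base-128 varint starting at offset i; return (value, next_offset)."""
    shift = result = 0
    while True:
        b = buf[i]; i += 1
        result |= (b & 0x7F) << shift
        if not b & 0x80:
            return result, i
        shift += 7

def walk_fields(buf):
    """Yield (field_number, wire_type, value) for each top-level field.
    Wire types: 0 = varint, 1 = 64-bit, 2 = length-delimited, 5 = 32-bit."""
    i = 0
    while i < len(buf):
        tag, i = read_varint(buf, i)
        field, wtype = tag >> 3, tag & 0x07
        if wtype == 0:                        # varint
            val, i = read_varint(buf, i)
        elif wtype == 2:                      # strings, bytes, nested messages
            length, i = read_varint(buf, i)
            val = buf[i:i + length]; i += length
        elif wtype == 1:                      # 64-bit fixed
            val = buf[i:i + 8]; i += 8
        elif wtype == 5:                      # 32-bit fixed
            val = buf[i:i + 4]; i += 4
        else:                                 # unknown/garbled; stop walking
            break
        yield field, wtype, val
```

Length-delimited fields (wire type 2) are where the strings and nested structures discussed below live.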
Each binarypb file here represents a “session,” which I define as each time Google Assistant was invoked. Based on my notes, I know when I summoned Google Assistant, how I summoned it, and what I did when I summoned it. By comparing my notes to the MAC times associated with each binarypb file I identified the applicable files for actions taken inside of the car (via Android Auto) and those taken outside of the car.
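The note-to-file comparison is easy to script. The sketch below pairs each binarypb file with its last-modified time; it assumes you are working against an exported copy of the app_session folder, and in practice MAC times should be pulled from the forensic image rather than a live filesystem.

```python
from pathlib import Path
from datetime import datetime, timezone

def list_sessions(app_session_dir):
    """Pair each binarypb session file with its last-modified time (UTC)
    so the files can be lined up against examiner notes."""
    rows = []
    for p in sorted(Path(app_session_dir).glob("*.binarypb")):
        mtime = datetime.fromtimestamp(p.stat().st_mtime, tz=timezone.utc)
        rows.append((p.name, mtime.isoformat()))
    return rows
```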
During my examination of the binarypb files created during sessions inside of the car, I found similarities across the files, which are as follows:
Each binarypb file will start by telling you where the request is coming from (car_assistant).
What is last chronologically is first in the binarypb file. Usually, this is Google Assistant’s response (MP3 file) to a vocal input just before being handed off to whatever service (e.g. Maps) you were trying to use. The timestamp associated with this is also at the beginning of the file.
A session can be broken down into micro-sessions, which I call vocal transactions.
Vocal transactions have a visible line of demarcation by way of the 16-byte string ending in 0x12.
A BNDL starts a vocal transaction, but also further divides the vocal transaction into smaller chunks.
The first vocal input in the binarypb file is marked by a 5-byte string: 0xBAF1C8F803, regardless of when, chronologically, it occurred in the session.
Each vocal input is marked by an 8-byte string: 0x014C604080040200. While the 5-byte string appears only before the first vocal input in the binarypb file (along with the 8-byte string), the 8-byte string appears just prior to each and every vocal input in the file.
When Google Assistant doesn’t think it understands you, it generates different variations of what you said…candidates…and then selects the one it thinks you said.
In sessions where Google Assistant needs to keep things tidy, it will assign an identifier. There does not appear to be any consistency (as far as I can tell) as to the format of these identifiers.
The end of the final vocal transaction is marked by a 16-byte string ending in 0x18.
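The demarcation string can be turned into a rough transaction splitter. The 16-byte value below is the one observed in my sample files; treat it as an observed constant from this data set, not a documented one, and expect it to need adjustment as more session files are examined.

```python
# 16-byte demarcation string observed between vocal transactions
# (last byte 0x12); the final transaction ends in a variant ending 0x18.
TRANSACTION_MARK = bytes.fromhex("00000006000000000000000000000012")

def split_transactions(session):
    """Split a raw binarypb session blob into vocal transactions at each
    occurrence of the observed 16-byte demarcation string."""
    parts, start = [], 0
    while True:
        cut = session.find(TRANSACTION_MARK, start)
        if cut == -1:
            parts.append(session[start:])   # final transaction
            return parts
        end = cut + len(TRANSACTION_MARK)
        parts.append(session[start:end])
        start = end
```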
Visually, sessions can be seen in Figure 4, and vocal transactions can be seen in Figure 5.
Figure 4. Visual representation of sessions.
Figure 5. Visual representation of vocal transactions.
One additional note here. I was contacted by a reader via Twitter and asked about adding byte offsets to Figures 4 and 5. Unfortunately, the byte offsets beyond the header are never consistent. This is because requests always differ, and, as a result, Google Assistant's responses (whether vocal, by action, or both) always differ as well. The thing to keep in mind is that there is a structure, and there are markers to help examiners locate this data.
To start, I chose 13099.binarypb. This session occurred on 01/28/2019 at 12:41 PM (EST) and involved reading new text messages and dictating a response. The session was initiated by "Ok, Google" while the phone was sitting on my desk in front of me, unlocked, and displaying the home screen. The session went like this:
Me: “Ok, Google.”
// Google Assistant acknowledged //
Me: “Read my new text message.”
GA: “You’ve got new messages from Josh Hickman plus 177089 and one other. The first one is a text from Josh Hickman. It says ‘So, this is your test message for reading texts while not in the car. Make sure you write everything down.’ Do you want to reply?”
Me: “Yes.”
GA: “Alright. What’s the message?”
Me: “Ok. I’ll make sure to document everything I do.”
GA: “So, that’s a message to Josh Hickman saying ‘Ok. I’ll make sure to document everything I do.’ Do you want to send it or change it?”
Me: “Send it.”
GA: “Looks like that message didn’t go through. Sorry. You might want to try again later. Next is a long text message from phone number plus 1 77089. Do you want to hear it?”
Me: “No.”
GA: “That’s the last one.”
Figure 6 shows the top of the binarypb file. In the blue box is something familiar: the 0x155951 hex value at offset 0x10. This string was also present in the binarypb files generated while inside the car (via Android Auto). In the orange box “opa” appears. This string appears at the top of each binarypb file generated as a result of using Google Assistant outside of the car. I suspect (based on other data seen in these files) that this is a reference to the Opa programming language. This would make sense as I see references to Java, too, which is used throughout Android. Additionally, Opa is aimed at both client-side and server-side operations (Node.js on the server and JavaScript on the client side). Again, this is speculation on my part, but the circumstantial evidence is strong.
Figure 6. Top of 13099.binarypb.
In the red boxes are the oh-so-familiar “BNDL’s.” In the green box the string “com.google.android.googlequicksearchbox” is seen. This is the folder in which the Quick Search Box resides, along with the session files for Google Assistant.
Just below the area in Figure 6 is the area in Figure 7. There are a couple of BNDL's in this area, along with the area in orange. This string appears to indicate that this part of the file was generated by a change in the conversation between Google Assistant and me: "TRIGGERED_BY" and "CONVERSATION_DELTA." See Figure 7.
Figure 7. A change in conversation triggered this vocal transaction
The area in the blue box is interesting as it is a string that is repeated throughout this session. I suspect…loosely…this is some type of identifier, and the string below it (in Figure 8) is some type of token.
Figure 8. ID with possible token…?
I will stop here for a second. There was a noticeable absence at the top of this file: there was no MP3 data here. A quick scan of the entire file finds no MP3 data at all. Determining whether this is unique to this particular file or a systemic trend will require examining other files (later in this article).
After the area in Figure 8 there was quite a bit of protocol buffer data. Eventually, I arrived at the area depicted in Figure 9. In it you can see the identifier from Figure 7 (blue box), a bit more data, and then a time stamp (red box). The value is 0x65148E9568010000, which, when read little endian, is 1548697343077 (Unix Epoch Time). Figure 10 shows the outcome using DCode.
Figure 9. Identifier and Unix Epoch Time time stamp.
Figure 10. Time stamp from Figure 9.
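The DCode step can be reproduced in a couple of lines of Python: these values are 8-byte little-endian counts of milliseconds since the Unix epoch.

```python
import struct
from datetime import datetime, timezone

def decode_timestamp(raw8):
    """Decode an 8-byte little-endian millisecond Unix epoch time stamp,
    as seen in the binarypb session files."""
    ms = struct.unpack("<Q", raw8)[0]
    return datetime.fromtimestamp(ms / 1000, tz=timezone.utc)

# The value from Figure 9:
# decode_timestamp(bytes.fromhex("65148E9568010000"))
# -> 2019-01-28 17:42:23 UTC, i.e. 12:42:23 EST
```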
The time stamp here is about a minute ahead of when I initiated the session. Remember what I said about the last thing chronologically being the first thing in the file? I suspect the last thing I said to Google Assistant will be the first vocal input data I see. See Figure 11.
Figure 11. Last vocal input of the session.
There is one bit of familiar data in here. If you read the first part of this article you will know that the string in the blue box (0xBAF1C8F803) appeared just before the first vocal input in a binarypb file, which is usually the last vocal input data of the session. It did not appear anywhere else within the file. It appears here, too, in a session outside of the car.
In the orange box is what appears to be some Java data indicating where this session started: "hotword." The hotword is the trigger phrase for Google Assistant, which, for me, is "Ok, Google." The 8-byte string in the green box (0x010C404000040200) is consistent throughout the file (save one location, discussed later), and, as suspected, the last vocal input I provided to Google Assistant appears here (purple box). A BNDL appears at the end in the red box.
Figure 12 shows some familiar data (from Figures 7 & 8): TRIGGERED_BY, CONVERSATION_DELTA, the identifier (blue box) and what I believe to be some token (red box). Note that the suspected token here matches that seen in Figure 8.
Figure 12. A rehash of Figures 7 & 8.
After some more protocol buffer data I find the area in Figure 13. It looks the same as the area shown in Figure 9, and the time stamp is the same.
Figure 13. The identifier again and another time stamp.
Figure 14 is a somewhat recycled view of what was seen in Figure 11, but with a twist. The Java data, which seems to indicate where the query came from, wraps the vocal input ("no"); see the orange box. A BNDL is also present.
Figure 14. Vocal input with a Java wrapper.
Also seen in Figure 14 is another time stamp in the red box, which is decimal 1548697279859. As before, I used DCode to convert this from Unix Epoch Time to 01/28/2019 at 12:41:19 (EST). This is the time I originally invoked Google Assistant.
Figure 15 shows some more data, and the end of the vocal transaction (see my Part 1 post). This is marked by the velvet:query_state:search_result_id string (purple box) and the 16-byte hex value of 0x00000006000000000000000000000012 (orange box). The string and accompanying hex value are the same ones seen in the binarypb files generated by interaction with Google Assistant via Android Auto.
Figure 15. Data marking the end of the vocal transaction.
Figure 16 shows the start of a new vocal transaction. The BNDL (seen at the bottom of Figure 15, but not marked) is in the red box. Just below it is the 8-byte string in the green box. Note that the last byte is 0x10 and not 0x00 as seen in Figure 11. My vocal input appears in the purple box; this input is what started the session. Just below it is another BNDL. See Figure 16.
Figure 16. New vocal transaction
The items below the BNDL are interesting. The orange box is something previously seen in this file: TRIGGERED_BY. However, the item in the blue box is new. The string is QUERY_FROM_HOMESCREEN, which is exactly what the phone was displaying when I invoked Google Assistant. The phone was on, unlocked, and I used the hotword to invoke Google Assistant, which leads me to the string in the brown box: "INITIAL_QUERY." The phrase "read my new text messages" was my original request. This area seems to imply that my phrase was the initial query and that it was made from the home screen. Obviously, there is plenty more testing to be done to confirm this, but it is a good hypothesis.
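Context strings like these are easy to enumerate across a set of session files. The marker list in the sketch below is only what has been observed in this data set and is almost certainly incomplete; extend it as more session files are examined.

```python
# Context strings observed in these sample session files; the list
# is almost certainly incomplete.
CONTEXT_MARKERS = [b"TRIGGERED_BY", b"CONVERSATION_DELTA",
                   b"QUERY_FROM_HOMESCREEN", b"INITIAL_QUERY"]

def context_markers(session):
    """Return the observed context markers present in a raw session
    blob, ordered by first occurrence in the file."""
    hits = [(session.find(m), m.decode()) for m in CONTEXT_MARKERS
            if m in session]
    return [name for _, name in sorted(hits)]
```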
Figure 17. A time stamp and a “provider.”
In Figure 17 there is a time stamp (red box): the decimal value is 1548697279878 (Unix Epoch Time) and the actual time is 01/28/2019 at 12:41:19 (EST). Again, this is the time Google Assistant was invoked. The portion in the blue box, while not a complete match, is data that is similar to data seen in Android Auto. I highlighted the whole box, but the area of interest is voiceinteraction.hotword.HotwordAudioProvider /34. In the Android Auto version, the related string was projection.gearhead.provider /mic /mic. In the Part 1 post, I indicated that the /mic /mic string indicated where the vocal input was coming from (my in-car microphone used via Android Auto). Here I believe this string indicates the origin of the Google Assistant invocation is via the hotword, although I am not completely sure about the /34.
The area in the blue box in Figure 18 is new. I have tried to find what the data in the box means or its significance, and I have been unable to do so. In addition to searching the Google developer forums, I pulled the phone’s properties over ADB in order to see if I could determine if the data was referring to the on-board microphone and speaker (ear piece), but the list of returned items did not have any of this data. At this point I have no idea what it means. If someone knows, please contact me and I will add it to this article and give full credit.
Figure 18. Something new.
I had to scroll through some more protocol buffer data to arrive at the area in Figure 18-1. There are several things here: the velvet:query_state:search_result_id with the accompanying 16-byte string ending in 0x12 (brown boxes), BNDLs (red boxes), the 8-byte string just prior to my vocal input (green box), my vocal input (purple box), the TRIGGERED_BY, CONVERSATION_DELTA strings (orange box – my response "yes" was a result of a change in the conversation), and the identifier that I had seen earlier in the file (blue box). Note that while the string in the green box matches the string seen in Figure 11, it differs from the one seen in Figure 16. The string in Figure 16 ends in 0x10, whereas the string here and the one in Figure 11 both end in 0x00.
Figure 18-1. The end of one vocal transaction and the beginning of another.
Just past the identifier seen in Figure 18-1, there was another string that I suspect is a token. This string starts out the same as the one seen in Figures 8 and 12, but it does differ. See Figure 19.
Figure 19. A new “token.”
Scrolling through more protocol buffer data finds the area seen in Figure 20. Here I find another time stamp (red box). The decoding methodology is the same as before, and it resulted in a time stamp of 01/28/2019 at 12:41:42 (EST). This would have been around the time I indicated (by saying "yes") that I wanted to reply to the text message Google Assistant had read to me. Additionally, the Java string appears (orange box), and the end of the vocal transaction is seen with the velvet:query_state:search_result_id and the accompanying 16-byte string ending in 0x12 (blue boxes).
Figure 20. The end of another vocal transaction.
Figure 21 has my dictated message in it (purple box), along with some familiar data, and a familiar format.
Figure 21. A familiar face.
At the top is a BNDL (red box), the 8-byte string ending in 0x00 (green box), another BNDL (red box), the TRIGGERED_BY, CONVERSATION_DELTA strings (orange box), and the identifier again (blue box). In Figure 22 another “token” is found (red box). This is the same one as seen in Figure 19.
Figure 22. Another “token.”
Yet more protocol buffer data, and yet more scrolling takes me to the area in Figure 23. In the red box is another time stamp. In decimal it is 1548697307562 (Unix Epoch Time), which converts to 01/28/2019 at 12:41:47 (EST). This would have been around the time I dictated my message to Google Assistant. The identifier also appears at the foot of the protocol buffer data (blue box).
Figure 23. Another time stamp.
Figure 24 shows the same data as in Figure 20: the end of a vocal transaction. The orange box contains the Java data, and the blue box contains the velvet:query_state:search_result_id and the accompanying 16-byte string ending in 0x12.
Figure 24. End of another vocal transaction.
Beyond my vocal input (purple box), the area seen in Figure 25 is the same as those seen in Figures 18-1 & 21. I even marked them the same… BNDL (red box), the 8-byte string ending in 0x00 (green box), another BNDL (red box), the TRIGGERED_BY, CONVERSATION_DELTA strings (orange box), and the identifier again (blue box).
Figure 25. The top of another vocal transaction.
Figure 26 shows an area after some protocol buffer data that trailed the identifier in Figure 25. The notable thing here is the time stamp in the red box. It is decimal 1548697321442 (Unix Epoch Time), which translates to 01/28/2019 at 12:42:01 (EST). This would have lined up with when I sent the dictated text message.
Figure 26. Time stamp from “No.”
Figure 27 shows the end of the vocal transaction here. In the orange box is the Java data, with the velvet:query_state:search_result_id and the accompanying 16-byte string ending in 0x12 in the blue box.
Figure 27. The end of a vocal transaction.
Figure 28 looks just like Figures 18-1, 21 & 25. The only difference here is my vocal input (“no”). This was the last thing I said to Google Assistant in this session, so I expect this last portion of the file (save the very end) to look similar to the top of the file.
Figure 28. Look familiar?
Figure 29 contains a time stamp (red box), which appears after a bit of protocol buffer data. It is decimal 1548697343077 (Unix Epoch Time), which converts to 01/28/2019 at 12:42:23 (EST). This is the same time stamp encountered earlier in this session file (Figure 9).
Figure 29. The last/first time stamp.
Figure 30 shows the end of the session file with the orange box showing the usual Java data. The end of this file, as it turns out, looks very similar to end of session files generated via Android Auto. Three things are present here that are also present in the end of the Android Auto session files. First, the velvet:query_state:search_result_id and the accompanying 16-byte string ending in 0x18 in the blue box. Second, the 9-byte string, 0x01B29CF4AE04120A10 in the purple box. Third, the string “and.gsa.d.ssc.” is present in the red box.
Figure 30. A familiar ending.
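Taken together, the three markers give a quick end-of-session check. A caveat on the sketch below: the 16-byte value is an assumption on my part. It mirrors the 0x12 variant from Figure 15 with the final byte swapped to 0x18, which matches what the figures show but has not been verified byte-for-byte against the raw files.

```python
def looks_like_session_end(tail):
    """Check the tail of a binarypb file for the three end-of-session
    markers. The 16-byte value is assumed (0x12 variant with a 0x18
    final byte), not verified byte-for-byte."""
    markers = [bytes.fromhex("00000006000000000000000000000018"),
               bytes.fromhex("01B29CF4AE04120A10"),   # the 9-byte string
               b"and.gsa.d.ssc"]
    return all(m in tail for m in markers)
```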
So, right away I see quite a few similarities between this session file and the ones generated by Android Auto. To maintain some consistency between these files and those from Android Auto, the next file I examined involved me asking for directions to my favorite coffee joint.
The next file I examined was 13128.binarypb. This session occurred on 01/28/2019 at 12:43 PM (EST) and involved asking for directions. The session was initiated by "Ok, Google" while the phone was sitting on my desk in front of me, unlocked, and displaying the home screen. The session went like this:
Me: “Ok, Google.”
// Google Assistant acknowledged //
Me: “I need directions to the Starbucks in Fuquay Varina North Carolina.”
GA: “The best way to get to Starbuck by car is via US-401 South and will take about 23 minutes in light traffic.”
The screen switched over to Google Maps and gave me the route and ETA. I did not choose anything and exited Maps.
The top of 13128.binarypb looks identical to 13099.binarypb (Figure 6). See Figure 31.
Figure 31. A familiar sight.
The gang is all here. The string 0x155951 (blue box), “opa” (orange box), com.google.android.googlequicksearchbox (green box), and a couple of BNDL’s (red box).
While no data of interest resides here, I am including Figure 32 just to show that the top of 13128 is just like 13099.
Figure 32. Nothing to see here.
Figure 33 is something I had seen in the previous file (see Figure 16), but further down. The blue and orange boxes contain the TRIGGERED_BY and QUERY_FROM_HOMESCREEN strings, respectively. Just like my previous session, this session was started with the phone on, unlocked, and by using the hotword to invoke Google Assistant, which leads me to the string in the red box: “INITIAL_QUERY.” This area seems to imply that whatever vocal input is about to show up is the phrase that was the initial query and that it was made from the home screen.
Figure 33. Query From Home Screen, Triggered By, Launched On, & Initial Inquiry.
Figure 34 looks almost identical to Figure 17. The red box contains a time stamp, which is decimal 1548697419294 (Unix Epoch Time). When converted it is 01/28/2019 at 12:43:39 (EST). The blue box contains the string voiceinteraction.hotword.HotwordAudioProvider /49. The /49 is different from the one seen in Figure 17, though (/34). Again, I am not sure what this is referring to, and I think it warrants more testing.
Figure 34. The query source and a time stamp.
Scrolling down just a hair finds the area in Figure 35. The orange box contains Java data we have seen before, but with a small twist. The string is webj and.opa.hotword* search and.opa.hotword, with the twist being "search" in the middle. As seen in the first file, it is almost as if the term in the middle is being wrapped (my "no" was wrapped the same way, as seen in Figure 14).
Figure 35. Something old and something old.
The area in the red box is the same data seen in Figure 18.
Figure 36 also contains some familiar faces. My vocal input is in the purple box, and the 5-byte string that usually appears at the first vocal input of the session, 0xBAF1C8F803, is in the blue box.
Figure 36. The first vocal input of the session.
An 8-byte string previously seen in 13099 is also here (see Figure 16). Note that this string ends in 0x10. In 13099 all of the 8-byte strings, save one, ended in 0x00. The one that did end in 0x10 appeared with the first vocal input of the session (“read my new text messages”). Here, we see the string ending in 0x10 with the only vocal input of the session. I hypothesize that the 0x10 appears before the first vocal input of the session, with any additional vocal input appearing with the 8-byte string ending in 0x00. More research is needed to confirm, which is beyond the scope of this article.
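If the hypothesis holds, classifying a marker is trivial. The sketch below encodes it, with the explicit caveat that it rests on only two sample files and should be treated as a lead, not a conclusion.

```python
def classify_input_marker(marker8):
    """Classify an 8-byte vocal input marker per the working hypothesis:
    a last byte of 0x10 precedes the session's chronologically first
    vocal input, 0x00 precedes any subsequent input. Based on two
    sample files only; treat the result as a lead, not a conclusion."""
    if len(marker8) != 8:
        raise ValueError("expected an 8-byte marker")
    return {0x10: "first input", 0x00: "subsequent input"}.get(
        marker8[-1], "unknown")
```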
Figures 37 and 38 show the same data as seen in Figures 33 and 34.
Figure 37. Same ol’ same ol’.
Figure 38. Same ol’, part deux.
Note that the time stamp seen in Figure 38 is the same as in Figure 34.
Figure 39 shows the mysterious string with the speaker id (red box) and Figure 40 shows my vocal input inside of a Java wrapper (orange box), which is similar to what was seen in 13099 (Figure 14).
Figure 39. Speaker identifier?
Figure 40. A Java wrapper and a time stamp.
The time stamp seen in Figure 40 is the same as the other time stamps seen in this session file, except for the first byte. In the other time stamps that byte is 0x1E, whereas here it is 0x08; this shifts the decimal value from 1548697419294 to 1548697419272. Regardless, the time is the same: 01/28/2019 at 12:43:39 PM (EST). Only the millisecond values differ: 294 versus 272, respectively.
Figure 41 shows the end of the vocal transaction, which is marked by the velvet:query_state:search_result_id and the accompanying 16-byte string ending in 0x12 in the blue box.
Figure 41. The end of the vocal transaction.
The start of a new vocal transaction is seen in Figure 42. The 8-byte value seen in the green box ends with 0x10, which keeps in line with my theory discussed earlier in this article. My vocal input (the only input of the session) is seen in the purple box. A BNDL is seen at the start of the transaction (red box) with another one at the end (red box).
Figure 42. The start of another vocal transaction.
In the interest of brevity, I will say that the next bit of the session file is composed of what is seen in Figures 37, 38, and 39 (in that order). The time stamp is even the same as the one seen in Figure 38. The next area is the last part of the session file as seen in Figure 43.
Figure 43. The end!
If Figure 43 looks familiar to you, that is because it is. I color coded the boxes the same way as I did in Figure 30. Everything that was there is here: the Java data (orange box), the velvet:query_state:search_result_id and the accompanying 16-byte string ending in 0x18 in the blue box, the 9-byte string 0x01B29CF4AE04120A10 in the purple box, and the string "and.gsa.d.ssc." in the red box.
At the beginning of this article I reviewed some consistencies between the Android Auto session files I examined. After examining the non-Android Auto files, I thought it would be beneficial to revisit those consistencies to see what, if anything, changed. The original statements are in italics, while the status appears just below each item.
Each binarypb file will start by telling you where the request is coming from (car_assistant).
This is still correct except “car_assistant” is replaced by “opa” and “googlequicksearchbox.”
What is last chronologically is first in the binarypb file. Usually, this is Google Assistant’s response (MP3 file) to a vocal input just before being handed off to whatever service (e.g. Maps) you were trying to use. The timestamp associated with this is also at the beginning of the file.
This is still correct, minus the part about the MP3 data.
A session can be broken down into micro-sessions, which I call vocal transactions.
This is still correct.
Vocal transactions have a visible line of demarcation by way of the 16-byte string ending in 0x12.
This is still correct.
A BNDL starts a vocal transaction, but also further divides the vocal transaction into smaller chunks.
This is still correct.
The first vocal input in the binarypb file is marked by a 5-byte string: 0xBAF1C8F803, regardless of when, chronologically, it occurred in the session.
This is still correct.
Each vocal input is marked by an 8-byte string: 0x014C604080040200. While the 5-byte string appears only before the first vocal input in the binarypb file (along with the 8-byte string), the 8-byte string appears just prior to each and every vocal input in the file.
Eh…sorta. While the 8-byte values differ between the Android Auto and non-Android Auto files, each contains a consistent 8-byte string. Further, the last byte of the 8-byte string in the non-Android Auto version varies depending on whether or not the vocal input is chronologically the first input of the session.
When Google Assistant doesn’t think it understands you, it generates different variations of what you said…candidates…and then selects the one it thinks you said.
Unknown. Because I was in an environment which was quiet, and I was near the phone, Google Assistant didn’t seem to have any trouble understanding what I was saying. It would be interesting to see what would happen if I introduced some background noise.
In sessions where Google Assistant needs to keep things tidy, it will assign an identifier. There does not appear to be any consistency (as far as I can tell) as to the format of these identifiers.
This is correct. In the 13099 file, there were multiple things happening, so an identifier with something that resembled a token was present.
The end of the final vocal transaction is marked by a 16-byte string ending in 0x18.
Still correct.
For those of you who are visual learners, I am adding some diagrams at the end that show the overall, generalized structure of both a session and a vocal transaction. See Figures 44 and 45, respectively.
Figure 44. Session file.
Figure 45. Vocal transaction.
There is way more work to do here in order to really understand Google Assistant. Phil Moore, of This Week in 4n6 fame, recently mentioned Part 1 of this article on the This Month in 4N6 podcast, and he made a very accurate statement: Google Assistant is relatively under-researched. I concur. When I was researching this project, I found nothing via Google's developer forums, and very little outside of them. There just isn't a whole lot of understanding of how Google Assistant behaves and what it leaves behind on a device.
Google Assistant works with any device capable of running Lollipop (5.0) or higher; globally, that is a huge install base! Additionally, Google Assistant can run on iOS, which adds to the install base and is a whole other batch of research. Outside of handsets there are the Google Home speakers (on which there has been some research), Android TVs, Google Home hubs, smart watches, and Pixelbooks/Pixel Slates. Google is making a push in the virtual assistant space and is going all in with Google Assistant (see Duplex). With all of these devices capable of running Google Assistant, it is imperative that practitioners learn how it behaves and what artifacts it leaves behind.
This topic is under-researched, and the author has provided excellent initial research in the area. The paper explains the analysis of Google's binary protocol buffer files, and the included screenshots support the author's findings. The provided data set allowed those findings to be verified.
Further testing could be conducted in different environments to determine differences in artifacts. This would be especially helpful in determining differences between a “quiet area” and a “noisy environment”. Additionally, an automated tool would be helpful to pull out user voice interactions from the binarypb files.
Brett Shavers (Verified Review using Author Provided Datasets)
Terrence Nemayire (Methodology Review)
Prashanth Malise (Methodology Review)