PDFs and exam guides are not so efficient, right? Prepare for your Google examination with our training course. The Professional Cloud Developer course contains a complete set of videos that will give you profound and thorough knowledge of the Google certification exam. Pass the Google Professional Cloud Developer test with flying colors.
Curriculum for Professional Cloud Developer Certification Video Course
Name of Video | Time |
---|---|
1. What is a GCP Professional Cloud Developer? | 3:00 |
2. Cloud Developer Exam Objectives | 5:00 |
3. Cloud Developer Deep Dive Notes | 2:00 |
4. Course Setup | 2:00 |
Name of Video | Time |
---|---|
1. Course Material Download | 2:00 |
2. What are Test Tips? | 2:00 |
3. Whiteboards and Demos | 1:00 |
4. Sign up for Free Tier or Credits if available | 2:00 |
5. GCP Pricing Calculator | 3:00 |
6. SDK Install | 10:00 |
7. Read the GCP Cloud Developer Deep Dive Notes | 2:00 |
Name of Video | Time |
---|---|
1. Google Cloud Intro (Condensed Version) | 8:00 |
2. GCP Hierarchy | 9:00 |
3. Whiteboard GCP Hierarchy | 6:00 |
4. Demo - Projects | 5:00 |
5. Compute Options | 4:00 |
Name of Video | Time |
---|---|
1. Compute Engine | 14:00 |
2. Compute Engine Demo | 8:00 |
3. Instance Groups | 2:00 |
4. Instance Groups Demo | 10:00 |
Name of Video | Time |
---|---|
1. App Engine Security Scanner | 6:00 |
2. App Engine Demo | 9:00 |
3. App Engine or Kubernetes Engine | 8:00 |
Name of Video | Time |
---|---|
1. Kubernetes Engine | 9:00 |
2. Demo - Kubernetes Engine | 6:00 |
3. Whiteboard - Kubernetes | 10:00 |
Name of Video | Time |
---|---|
1. Cloud Functions Basics | 7:00 |
2. Cloud Functions Demo | 4:00 |
Name of Video | Time |
---|---|
1. Data Storage Basics | 12:00 |
2. Cloud Storage Basics | 10:00 |
3. Cloud Storage | 16:00 |
4. Cloud Storage Demo Part 2 | 13:00 |
Name of Video | Time |
---|---|
1. Networking Overview | 10:00 |
2. VPC Overview | 11:00 |
3. IP Addressing | 3:00 |
4. Networking Whiteboard | 11:00 |
5. Networking Demo | 4:00 |
6. VPC Whiteboard | 5:00 |
7. Hybrid Connectivity Networking Whiteboard | 6:00 |
Name of Video | Time |
---|---|
1. Download SDK | 10:00 |
2. DevOps on GCP Part 1 | 17:00 |
3. DevOps on GCP Part 2 | 6:00 |
4. Cloud Source Repositories Demo | 9:00 |
5. Cloud Build Demo | 5:00 |
6. Demo gcloud Commands | 3:00 |
Name of Video | Time |
---|---|
1. Section 1 Coverage | 2:00 |
2. 1.1 Designing performant applications and APIs | 1:00 |
3. Infrastructure as a Service vs. Container as a Service vs. Platform as a Service | 3:00 |
4. Portability vs. platform-specific design | 4:00 |
5. Evaluating System Considerations | 7:00 |
6. Operating system versions and base runtimes of services | 3:00 |
7. Service Locality | 3:00 |
8. Whiteboard Service Locality | 6:00 |
9. Locality Test Tips | 3:00 |
10. Microservices | 3:00 |
11. Whiteboard - Microservices | 5:00 |
12. Test Tips - Microservices | 1:00 |
13. Defining a key structure for high write applications using Cloud Storage, Cloud | 7:00 |
14. Defining a key structure Test Tips | 1:00 |
15. Session management | 5:00 |
16. Test Tips Session Management | 2:00 |
17. Loosely coupled applications using asynchronous Cloud Pub/Sub events | 3:00 |
18. Demo Pub/Sub | 3:00 |
19. Test Tips Cloud Pub/Sub | 1:00 |
20. Deploying and securing an API with Cloud Endpoints | 11:00 |
21. Demo Cloud Endpoints | 4:00 |
22. Test Tips - API Management | 2:00 |
23. Health checks | 4:00 |
24. Test Tips Health Checks | 1:00 |
25. Google-recommended practices and documentation | 5:00 |
26. 1.2 Designing secure applications | 1:00 |
27. Applicable regulatory requirements and legislation | 4:00 |
28. Test Tips Regulatory Requirements | 1:00 |
29. Security mechanisms that protect services and resources | 6:00 |
30. Test Tips Security Mechanisms | 1:00 |
31. Storing and rotating secrets | 12:00 |
32. Secrets Whiteboard | 6:00 |
33. Test Tips Secrets | 3:00 |
34. IAM roles for users/groups/service accounts | 9:00 |
35. IAM Whiteboard | 2:00 |
36. Test Tips IAM | 3:00 |
37. HTTPS certificates | 1:00 |
38. Test Tips HTTPS Certificates | 1:00 |
39. Demo HTTPS Certificates | 2:00 |
40. Google-recommended practices and documentation | 2:00 |
41. Defining database schemas for Google-managed databases | 7:00 |
42. Session Management | 5:00 |
43. Test Tips Session Management | 2:00 |
44. Loosely Coupled Apps – Cloud Pub/Sub | 3:00 |
45. Demo - Loosely Coupled Apps – Cloud Pub/Sub | 3:00 |
46. Whiteboard - Cloud Pub/Sub | 5:00 |
47. Test Tips Loosely Coupled Apps – Cloud Pub/Sub | 1:00 |
48. Deploying and securing an API with Cloud Endpoints | 11:00 |
49. Demo Deploying and securing an API with Cloud Endpoints | 4:00 |
50. Test Tips Deploying and securing an API with Cloud Endpoints | 2:00 |
51. Test Tips Health Checks | 4:00 |
52. Health Checks | 4:00 |
53. Choosing data storage options based on use case considerations | 6:00 |
54. Test Tips Data Storage | 1:00 |
55. Working with data ingestion systems | 9:00 |
56. Following Google-recommended practices and documentation | 5:00 |
57. Using managed services | 4:00 |
58. Using the strangler pattern for migration | 3:00 |
59. Strangler Whiteboard | 5:00 |
60. Codelabs - Exercise for Practice - Cloud Functions | 2:00 |
61. Codelabs - Cloud Pub/Sub | 1:00 |
62. Google-recommended practices and documentation | 5:00 |
63. Section Review Questions | 8:00 |
Name of Video | Time |
---|---|
1. Section 2 Building and Testing Applications | 2:00 |
2. Local application development emulations | 1:00 |
3. Developer Tools and SDK | 4:00 |
4. Demo SDK Install and basic commands | 13:00 |
5. Demo SDK Emulators | 6:00 |
6. Demo CLI Create Project | 9:00 |
7. 2.2 Building a continuous integration pipeline | 1:00 |
8. Creating a Cloud Source Repository and committing code to it | 8:00 |
9. DevOps and Pipelines | 10:00 |
10. Developing unit tests for all code written | 7:00 |
11. Developing an integration pipeline using services | 4:00 |
12. Reviewing test results of continuous integration pipeline | 1:00 |
13. Test Tips | 2:00 |
14. Whiteboard - DevOps | 5:00 |
15. 2.3 Testing. Considerations include: | 1:00 |
16. Performance Testing | 9:00 |
17. Whiteboard Testing | 9:00 |
18. Test Tips Testing | 3:00 |
19. Algorithm design | 1:00 |
20. Modern application patterns | 2:00 |
21. Efficiency | 4:00 |
22. Test Tips | 2:00 |
23. Section Review Questions | 9:00 |
Name of Video | Time |
---|---|
1. Section 3 | 3:00 |
2. 3.1 Implementing appropriate deployment strategies based on the target compute | 1:00 |
3. Blue Green Deployments | 6:00 |
4. Whiteboard App Engine | 8:00 |
5. Demo App Engine | 3:00 |
6. 3.2 Deploying applications and services on Compute Engine | 1:00 |
7. Launching a compute instance using GCP Console and Cloud SDK | 8:00 |
8. Moving a persistent disk to different VM | 6:00 |
9. Creating an autoscaled managed instance group using an instance template | 11:00 |
10. Generating/uploading a custom SSH key for instances | 4:00 |
11. Configuring a VM for Stackdriver monitoring and logging | 5:00 |
12. Creating an instance with a startup script that installs software | 3:00 |
13. Creating custom metadata tags | 5:00 |
14. Creating a load balancer for Compute Engine instances | 10:00 |
15. 3.3 Deploying applications and services on Google Kubernetes Engine | 1:00 |
16. Deploying a GKE cluster | 4:00 |
17. Kubernetes Engine Whiteboard | 10:00 |
18. Kubernetes Engine Clusters Demo | 18:00 |
19. Test Tips | 2:00 |
20. 3.4 Deploying an application to App Engine. Considerations include: | 1:00 |
21. Scaling configuration | 10:00 |
22. GKE or App Engine | 8:00 |
23. Test Tips App Engine | 3:00 |
24. Cloud Functions that are triggered via an event | 5:00 |
25. Cloud Functions that are invoked via HTTP | 2:00 |
26. 3.6 Creating data storage resources. Tasks include: | 1:00 |
27. Creating a Cloud SQL instance | 4:00 |
28. Cloud Datastore | 13:00 |
29. Creating BigQuery datasets | 3:00 |
30. Creating a Cloud Storage bucket | 7:00 |
31. Creating a Cloud Pub/Sub topic | 5:00 |
32. Test Tips Create data storage | 1:00 |
33. 3.7 Deploying and implementing networking resources. Tasks include: Creating an | 1:00 |
34. Creating an auto mode VPC with subnets | 4:00 |
35. Setting up a domain using Cloud DNS | 4:00 |
36. Test Tips Networking | 3:00 |
37. 3.8 Automating resource provisioning with Deployment Manager | 1:00 |
38. Deployment Manager | 2:00 |
39. Deployment Manager Demo | 7:00 |
40. Test Tips | 1:00 |
41. 3.9 Managing Service accounts. Tasks include: Creating a service account with a | 1:00 |
42. Service Accounts | 4:00 |
43. Save KeyFile | 3:00 |
44. Codelabs | 1:00 |
45. Test Tips | 1:00 |
Name of Video | Time |
---|---|
1. Section 4 Overview | 1:00 |
2. Enable BigQuery and permissions on dataset | 11:00 |
3. SQL Searches/Selects | 4:00 |
4. Whiteboard - Fetching Ingesting data | 4:00 |
5. Codelab - BigQuery | 1:00 |
6. BigTable or BigQuery | 2:00 |
7. Writing an SQL query to retrieve data from relational databases | 9:00 |
8. Connecting to SQL | 3:00 |
9. gsutil Storing and retrieving objects from Google Storage | 4:00 |
10. Quickstart - Cloud Storage gsutil | 1:00 |
11. Connecting to a Cloud SQL instance | 3:00 |
12. Enabling Cloud Spanner and configuring an instance | 8:00 |
13. Cloud Spanner Whitepaper | 1:00 |
14. Demo - Cloud Spanner | 16:00 |
15. Cloud Spanner Best Practices | 5:00 |
16. Dataproc or Dataflow | 1:00 |
17. Test Tips | 4:00 |
18. Configuring a Cloud Pub/Sub push subscription to call an endpoint | 5:00 |
19. Data Ingestion Sources | 3:00 |
20. 4.2 Integrating an application with compute services. Tasks include: | 1:00 |
21. Provisioning and configuring networks | 11:00 |
22. Writing an application that publishes/consumes from Cloud Pub/Sub | 5:00 |
23. Authenticating users by using OAuth2 Web Flow and Identity Aware Proxy | 7:00 |
24. Reading instance metadata to obtain application configuration | 4:00 |
25. Test Tips OAuth | 2:00 |
26. 4.3 Integrating Google Cloud APIs with applications. Tasks include: | 1:00 |
27. Enable API | 16:00 |
28. Using Pre-Trained ML APIs | 6:00 |
29. Using service accounts to make Google API calls | 3:00 |
30. Using API Calls | 1:00 |
31. Making API calls | 8:00 |
32. Using the Cloud SDK to perform Basic Tasks | 8:00 |
33. DLP API | 9:00 |
34. Test Tips | 1:00 |
35. Section Review Questions | 5:00 |
Name of Video | Time |
---|---|
1. Objectives 5.1 Installing Monitoring and Logging | 1:00 |
2. Install Stackdriver Monitoring Agent | 1:00 |
3. Objectives 5.2 | 1:00 |
4. Debugging a VM image with serial port | 6:00 |
5. Using the CLI tools | 6:00 |
6. Analyzing a failed VM instance | 1:00 |
7. Sending Logs from a VM | 6:00 |
8. Test Tips | 1:00 |
9. Objectives 5.3 | 1:00 |
10. Monitoring Dashboard and Views | 1:00 |
11. Create Monitoring Dashboard Stackdriver | 7:00 |
12. Viewing Logs In Console | 6:00 |
13. Viewing Syslogs from a VM | 4:00 |
14. Streaming Logs | 5:00 |
15. Creating Logging Sinks | 6:00 |
16. Create Custom Metrics | 5:00 |
17. Graphing Metrics | 7:00 |
18. Using Stackdriver Debugger | 8:00 |
19. Review Stack Traces | 8:00 |
20. Test Tips | 4:00 |
21. Objectives 5.4 | 1:00 |
22. Setting Up Uptime Checks and Basic Alerts | 6:00 |
23. Troubleshooting Network Issues | 11:00 |
24. API Debugging | 5:00 |
25. Codelab Networking For Developers | 2:00 |
26. Review App Performance Stackdriver | 11:00 |
27. Troubleshooting Image and OS | 10:00 |
28. Docs and Support | 4:00 |
29. Test Tips | 1:00 |
30. Section Review Questions | 4:00 |
Name of Video | Time |
---|---|
1. Case Study Overview | 4:00 |
2. Case Study Sample Questions | 4:00 |
3. Test Tips | 3:00 |
Name of Video | Time |
---|---|
1. Pricing Calculator | 3:00 |
2. Qwiklabs | 5:00 |
3. Codelabs Free to Use | 4:00 |
4. Stack Overflow | 4:00 |
5. Project Treehouse | 1:00 |
6. GCP Pricing Calculator | 7:00 |
7. GCP Stencils and Icons | 4:00 |
8. Gcping | 2:00 |
100% Latest & Updated Google Professional Cloud Developer Practice Test Questions, Exam Dumps & Verified Answers!
30 Days Free Updates, Instant Download!
Professional Cloud Developer Premium Bundle
Free Professional Cloud Developer Exam Questions & Professional Cloud Developer Dumps
File Name | Size | Votes |
---|---|---|
google.braindumps.professional cloud developer.v2024-11-03.by.joseph.89q.vce | 912.92 KB | 1 |
google.realtests.professional cloud developer.v2021-10-15.by.spike.46q.vce | 689.44 KB | 1 |
google.test4prep.professional cloud developer.v2021-05-05.by.william.46q.vce | 689.44 KB | 2 |
Google Professional Cloud Developer Training Course
Want verified and proven knowledge for the Professional Cloud Developer exam? It's easy when you have ExamSnap's Professional Cloud Developer certification video training course by your side, which, along with our Google Professional Cloud Developer Exam Dumps & Practice Test questions, provides a complete solution to pass your exam.
Data storage. Now, data storage is, of course, a pretty wide area in Google Cloud. When we consider storage generally, Cloud Storage comes to mind first, but there are other types of storage as well, not only for unstructured data but also for structured data: Cloud SQL, Cloud Storage, Bigtable, BigQuery, and Cloud Datastore. There are a lot of choices to consider, but when we're reading an exam question that asks, for example, what type of storage we should design into our application, or which cloud service should meet the data requirements, one of the first things to think about is whether the data is structured, unstructured, or, for that matter, semi-structured. We want to consider the structure first, then the availability, and also any compliance requirements. For example, do we have compliance requirements that state that the storage has to stay in the US? If that's the case, we can certainly still use any of the services.
In a lot of cases, it's just a question of making sure we deploy in the right region and protect the data appropriately. The next thing we want to think about is the cost model; cost is going to come into play here, I'm sure. And what about performance? Bigtable and BigQuery sound the same, but they perform very differently and have very different use cases, so their latency characteristics differ as well. Likewise, if we compare Cloud Storage with Cloud SQL, performance is again going to be different.
With that said, just be aware that latency comes into play as well. When it comes to migration, we also want to be aware of how we're going to migrate the data. When we're choosing a storage option, we want to think about which service to use based on the information we have: structured, unstructured, et cetera. In Google Cloud, we have structured services and unstructured services. On the structured side, there are Cloud SQL, Bigtable, BigQuery, Cloud Datastore, and Cloud Spanner. On the unstructured side, there are Cloud Storage and Cloud Filestore. Now, I just want to be clear on this: that doesn't mean you can't use Cloud Storage, for example, for structured data. It just wouldn't be very efficient, and it's not really meant for that. And we certainly wouldn't want to use Cloud SQL for blob storage.
That would be a horrible waste of effort and resources, and it probably wouldn't work very well. Here are some things we're going to discuss in much more detail throughout the course, especially when we get into the objectives part of the course: for each service, we want to be aware of the use case, what it's good for, and the storage type. There will be different sections on each of these. But just be aware that, for example, Cloud Storage is object storage. Cloud Datastore is really meant to be a NoSQL document database. Cloud SQL is your relational SQL database, as is Cloud Spanner to a degree. And Bigtable, of course, is NoSQL and again has a very different use case. If we look at the use cases listed here, I'm not going to read them to you, but when we get to the case study questions near the end of the course, we'll talk more about which use case is right and which services we'd use. Now, of course, we want to be aware of these specific areas; I talked about these earlier, just as a reminder. But let's talk about structured and unstructured data. When it comes to unstructured data, what is it, and why is it important? If data doesn't have a sequence, a strict data model, or a schema, it's generally considered unstructured. Emails, Word documents, social media posts, et cetera, are going to be your unstructured data. Typically, Cloud Storage is really the only service built for managing unstructured data. There is Filestore as well, but that's network-attached storage, so that's a slightly different use case. Then there's structured data: data that does have a strict data model, a schema.
Again, it's more of a specific use case. Generally, if you see anything about an RDBMS or structured data on the exam, it's probably going to point to Cloud SQL or Cloud Spanner, with some exceptions; we'll talk about those when they come up. If you see SQL, then we know it's probably going to be structured data, unless they specify NewSQL or NoSQL. Again, on the exam they do have a tendency to give you some hints; you just have to read into the question to figure out what the hint is. We'll, of course, have some examples coming up. Now, what about semi-structured data? This is going to be more about tagging, and it's going to be focused on file types like JSON or XML. NoSQL is also considered semi-structured. Now, this is where we want to spend quite a bit of our exam prep: go to the storage options page and work through the decision trees or flow charts that they have.
This one is more like a decision tree. Basically, look at it and try to determine, "Do I understand this or do I not?" If you understand, for example, what unstructured data is and which services you can use for it, then you would go straight to Cloud Storage or to Firebase. Firebase is focused more on the mobile app environment; that's a different use case, and Cloud Storage is really where we want to go in most cases. The real choice comes down to: is it structured or is it not? Once we get past that, we can determine: is it analytical? If it's analytical, then we know it's Bigtable or BigQuery, and we then need to determine whether it needs low latency or not, and whether the service is managed or not. Basically: are we using this for analytics, or is this a data warehouse? BigQuery is really meant to be a warehouse; that's where we store the data and then ask questions about it later. Bigtable is more focused on analyzing your data and trying to get immediate value out of it, which is a slightly different use case. We're going to talk much more about these topics as they come up throughout the course. Alright, now I have several test tips here. The first set: again, we need to know structured versus unstructured data; on the exam you're going to get an example or two that focuses on either of these. We want to be aware of which service we're going to have to specify when using Google Cloud.
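The decision tree described above can be sketched as a small function. This is a teaching aid rather than Google's official flowchart, and the function name and parameters are invented for illustration; it encodes only the rules of thumb from this module (Firebase for mobile apps, Cloud Storage for other unstructured data, BigQuery for warehousing, Bigtable for low-latency analytics, Cloud Spanner for horizontally scaling relational workloads, Cloud SQL otherwise).

```python
def pick_storage_service(structured: bool, analytical: bool = False,
                         low_latency: bool = False, mobile: bool = False,
                         horizontal_scale: bool = False) -> str:
    """Rough sketch of the storage-options decision tree from this module.

    Real decisions also weigh cost, compliance region, and migration
    path, so treat this as a study aid, not a definitive chooser.
    """
    if not structured:
        # Unstructured data: Firebase for mobile apps, else Cloud Storage.
        return "Firebase" if mobile else "Cloud Storage"
    if analytical:
        # Low-latency analytics -> Bigtable;
        # warehouse-style "store now, query later" -> BigQuery.
        return "Bigtable" if low_latency else "BigQuery"
    # Relational workloads: Spanner scales horizontally/globally,
    # Cloud SQL scales vertically within a region.
    return "Cloud Spanner" if horizontal_scale else "Cloud SQL"

print(pick_storage_service(structured=False))                  # Cloud Storage
print(pick_storage_service(structured=True, analytical=True))  # BigQuery
```

Walking exam scenarios through a sketch like this is a quick way to check whether you can reproduce the decision tree from memory.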
This is going to be very important, and this is true not just for this exam but for any Google exam. We really need to know which service we should use in which scenario, and we need to be aware of the cost structure. Is the data structured or unstructured? Is it a data warehouse or not? What is the requirement we need to meet? Now, Pub/Sub: I didn't talk about Pub/Sub here, and we do have several sections of the course on Pub/Sub, but I did want to point out one thing. If we're going to have analytics or some kind of data pipeline in the cloud, Pub/Sub is really meant for ingesting that volume and dumping it into, for example, BigQuery; we could use Dataflow as well, and we'll talk more about Pub/Sub and why we want to use it. Dataproc is another service we'll talk more about if you're not familiar with it, but I just want to point out that if we see anything on the exam about Apache-based services such as Hadoop, this is really the only service we're going to use. Now, Dataflow does have some support for Apache Beam, but that's more on the back end.
It's a little different scenario; I'll talk about Dataflow in detail as well. Now, I think a lot of the decisions made during the exam come down to our processing pipelines and which services fit where; once we get into the objectives, you'll see this come to fruition. The last set of test tips for this review covers Cloud SQL and which databases it supports, and then Cloud Spanner and knowing what it is. Spanner is a proprietary service, not an open one; there's a lot of special SQL configuration we have to do to make it work, but it's all about the use case, and we'll talk much more about this throughout the objectives. Again, for the exam, just be aware that Cloud SQL scales vertically, up and down, whereas Cloud Spanner is meant to scale horizontally. We can add more resources to a Cloud SQL instance, but Cloud SQL won't really scale out horizontally very well, whereas Spanner is meant to do that on a global scale. Let's move on to the next module.
Cloud storage. Well, Cloud Storage is, of course, our scalable, flexible, and durable object storage, which we're going to use in most cases with our virtual machines. We can use Cloud Storage for many different reasons: we can connect App Engine to it, Kubernetes, whatever we're trying to accomplish probably has a good use case. Basically, Cloud Storage is unified object storage that developers and enterprises can use as a staging area, and you can use it as an archive as well. It's high performance and internet scale. And one of the things about it is that it presents objects in a simple way, with metadata.
Now, each bucket and object has what's called a URI, basically an identifier, sort of like a URL for a website, used to get to the object. This is really the de facto storage on Google Cloud. Now, it's important to realise that Cloud Storage is not a file system. We could certainly use Cloud Storage in place of a file system, though: if for some reason we did want to take a file-structure approach for our application, we could use FUSE, but we'd probably be better served by Cloud Filestore, which is the network-attached storage solution in Google Cloud. There are APIs available: client libraries, REST APIs, and, as I stated earlier, gsutil. Now, gsutil is the CLI for Cloud Storage. It's installed alongside the gcloud command-line tool as part of the Cloud SDK, and we can use it to manipulate our storage objects, create buckets, mount our file structure, or whatever you want to do. We'll have demos on this in the course as well, but for now we just want to cover the basics.
Now, Cloud Storage supports both online and offline imports. What's nice is that we don't need four different APIs to reach four different storage tiers, for example Nearline and Coldline, so we don't have to worry about that. There are charges, of course, to use Cloud Storage, though some usage is part of the free tier. For example, traffic going into Google Cloud is generally free, as is traffic within a region, but traffic leaving Google Cloud is going to cost money; that will vary based on what you're doing. Now, there's some terminology I just want to clarify for those not familiar with Cloud Storage. When we deploy a project, our Cloud Storage resources live inside that project, so we want to be aware of which projects, buckets, and objects we're deploying. We have to enable the API separately for each project, and billing, authentication, and monitoring are all handled inside the project. Buckets are the basic containers: they hold our objects, fairly straightforward. Objects, in turn, have two components.
We have data and metadata, and it's pretty much exactly what it sounds like: the data is what we store, and the metadata describes that data. Some important notes to know for any Google exam: Cloud Storage uses a flat namespace, and it is a global namespace, so we need unique names across the entire Cloud Storage ecosystem. A few more notes to be aware of: Cloud Storage offers four storage classes, which we'll talk more about in a few seconds, and we use the same API for each storage class. There are, of course, charges; some of this is review, and I'll talk about Nearline, Coldline, Multi-Regional, and Regional shortly. So, the storage classes are Multi-Regional, Regional, Nearline, and Coldline. Basically, we want to deploy Regional or Multi-Regional if it's production storage that we're going to access on a fairly routine basis, say at least every 30 days or so. There is a link here; I'd recommend you go to that page, which talks about the storage classes in more detail.
There is a little bit more on the storage features in the objectives. The main thing to realise for this exam is that they're not testing you heavily on Cloud Storage, except for perhaps whether you'd deploy Regional, Nearline, or Coldline, and what some useful features are, for example around object management. I'll cover those right now. The first is the ability to maintain versions. Let's say, for example, that I upload an object called object one. I could rename it object two, or I could leave it as object one, version one; it's up to me. This gives me flexibility in my applications when I'm designing, say, a mobile app: maybe it's a temporary file I want to store, whatever I want to do. Then there's lifecycle management. For example, if I don't need temp files for more than 24 hours, I could simply create a rule, or policy, to remove any objects with a .tmp extension after 24 hours and just delete them. And what if I want to be notified when there is a change to a file or object? Again, I could set up a notification; what we're creating is basically a webhook that triggers based on a condition. And then I can import into Cloud Storage as well, and I have a demo on that later in the course.
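As a concrete illustration of the ".tmp after 24 hours" rule described above, a Cloud Storage lifecycle policy could look roughly like this. This is a sketch: lifecycle `age` conditions are expressed in whole days (so one day stands in for 24 hours), and the bucket name in the usage note is hypothetical.

```json
{
  "rule": [
    {
      "action": {"type": "Delete"},
      "condition": {
        "age": 1,
        "matchesSuffix": [".tmp"]
      }
    }
  ]
}
```

A policy file like this would typically be applied with something like `gsutil lifecycle set lifecycle.json gs://my-example-bucket`.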
So be aware of that coming up. So what is OLM? OLM, or Object Lifecycle Management, allows us to set rules and determine how to handle an object based on those rules. And then I can notify myself if there's a change. Let's say, for example, that someone goes to a website, logs in, starts looking around, and gets that little pop-up with the chat; it's the same approach. We create a webhook: say someone drops a file or someone reads a file, I can notify whomever I want that the object was active or viewed, or whatever I need to do. And then import: import actually allows us to transfer from AWS, from on-premises, or from another Google Cloud bucket, and I'll walk through a demo on this as well. Coming up now, as far as test tips, we're going to cover Cloud Storage a little bit more. The main thing to keep in mind for this part of the course is just knowing the basics. We want to know some timelines: for example, if data is accessed every 30 days or less, we generally want to keep it in Multi-Regional or Regional storage.
After 30 days without access, we may improve our cost efficiency by moving it to Nearline storage, and after 90 days we may want to move it to Coldline storage, which is archival storage. One thing to point out: if you're familiar with AWS, you have Glacier. Well, Glacier is a whole different animal than Coldline. With Google Cloud, all of this is online and available on disk. There are no tape archives to request; you don't have to request a whole archive just to get one file out of 60 GB of data. Again, it's a different approach. I'll talk more about Cloud Storage during the objectives part of the course. Let's go ahead and move on.
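The 30-day and 90-day rules of thumb above can be condensed into a tiny helper. The function name is invented for illustration, and it encodes only this module's guidance; pricing and class names change over time, so always check the current storage class documentation.

```python
def pick_storage_class(days_between_accesses: int) -> str:
    """Rule of thumb from this module for choosing a storage class.

    Accessed at least every 30 days -> Regional/Multi-Regional;
    roughly monthly to quarterly (31-90 days) -> Nearline;
    rarely (more than 90 days) -> Coldline (archival, but still on
    disk and online, unlike AWS Glacier).
    """
    if days_between_accesses <= 30:
        return "Regional/Multi-Regional"
    if days_between_accesses <= 90:
        return "Nearline"
    return "Coldline"

print(pick_storage_class(7))    # Regional/Multi-Regional
print(pick_storage_class(60))   # Nearline
print(pick_storage_class(365))  # Coldline
```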
I'm over here at the Google Cloud Platform dashboard. What I'd like to do now is create a storage bucket, and we'll talk about some of the features and functions that are available when you create a storage bucket in the Google Cloud Platform. As with most Google services, there are several ways to get where you want to go. The first is to go over to Products and Services on the left-hand side and select Storage there. Or we can go over to the search box and type in "cloud storage". You can see that both of those bring us to the same place.
Also, when I type in "cloud storage", you can see that it brings up the APIs for cloud storage. Let's select Cloud Storage. We are now in the Cloud Storage dashboard. Looking at the dashboard, we can see that there are already two storage buckets. These are for one of my App Engine applications that I created, just a Hello World app. You can see from the buckets that are already created that we have a name and the default storage class, which is Regional. We have the location, which is the east coast of the United States. We have the lifecycle column. Remember, lifecycle is where you can specify actions that will manage that object; for example, if you want to downgrade the storage class, you can do that.
Let's say anything over six months or a year old is something you want to downgrade or delete; whatever you'd like to do, you create a lifecycle rule for it. Labels are important to create as well, especially in a production environment: they allow you to search, find, and utilise some of the services more efficiently. Then there's Requester Pays. This one's interesting: if you set up a Cloud Storage share with this enabled, the requester of those files pays for the access. This is good for a subscription service, for example. Let's take a look at the options on the left side. We have the Browser, which is where we are by default. We have Transfer: this is a transfer service where you can transfer your files from Amazon Web Services S3, or from other buckets in Google Cloud; say you want to do some kind of cross-regional migration, you can do that. You can also set up in-house applications to use the service. Then there's the Transfer Appliance, an additional service that you can sign up for and subscribe to; this is meant for transferring large amounts of files.
We're talking generally, you know, about petabytes. In most cases, you could certainly use it for less, but that, of course, is your call. We'll talk more about transferring data in that module. Lastly, we have Settings. This is where you could specify project access. Remember, too, that when you're creating a bucket, you're generally going to need to be aware of the project that you're in. These resources are, again, concerned with where you're placing them, which is the project, so be aware of the project that you're working on. There's Interoperability as well. You could see that it picked up the project. Again, this is the default project; this is the one that we're locked into. You can create a new key as well for storage access keys. This is, of course, important when you're creating applications that may want to drop off or pick up object storage in Cloud Storage from, let's say, an external application. You'll probably want to use your own keys. Let's go over to the Browser and create a new bucket. When we go over to the right side of the interface, you can see that it says "Browser." We have several options at this point, such as creating buckets and refreshing. If I hit refresh, nothing happens. But if I select this staging bucket, you can see that an option has now been enabled, or highlighted, that says Delete. In this case, I don't want to delete that; I'd like to create a new bucket. Let's go ahead and select Create bucket.
Now, as you're aware, there are different storage classes, and you want to determine what storage class to use based on availability, access requirements, and your costs as well. But before we select the storage class, let's really talk about bucket names. The bucket name needs to be unique; this is a name that will be unique not only to you but on a global scale. So when we type in, let's say, "test," you can see that that bucket is already in use. Someone took it. It also reminds me that bucket names must be globally unique, so just be aware of that. I'm going to go ahead and call this something that would probably be unique (it needs to be lowercase as well): GCP Test Bucket Storage Course, and then One. So this will be a fairly simple name: GCP Test Bucket Storage Course One. Now, one of the things I've noticed is that typically, the shorter the name, the more likely it is to be taken. Also, you'll want to pay attention when you create your buckets: you want to identify each bucket in a way that makes sense for your organization. For example, let's say this is for your Hadoop ingestion processes. You're using Dataproc and you want Dataproc to be utilized, but you also want to use other services for big data, Hadoop clusters, whatever. You go ahead and name it after the specific application. For example, a use case could be Big Data One, or a mobile application with an Oracle database dump.
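The same bucket creation can be done with gsutil. A sketch, with a hypothetical bucket name; the local check below only validates the name's format, since global uniqueness can only be checked by the service itself:

```shell
# Bucket names must be globally unique, lowercase, 3-63 characters,
# using letters, digits, dashes, underscores, and dots. A quick local
# format check (this does NOT check global uniqueness):
name="gcp-test-bucket-storage-course-1"   # hypothetical name
if echo "$name" | grep -Eq '^[a-z0-9][a-z0-9._-]{1,61}[a-z0-9]$'; then
  echo "format ok: $name"
else
  echo "invalid bucket name" >&2
fi

# Create it with an explicit storage class and location
# (requires an authenticated Cloud SDK; shown as a comment here):
# gsutil mb -c regional -l us-east1 "gs://$name"
```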
Let's just say you're dumping metadata or something; whatever makes sense in your situation. You wouldn't use this, of course, for your Oracle databases themselves, but it could be that you just want to save some redo logs, some backups, something that you could save as an object. Now, a couple of other important things to point out before we proceed. Generally, as a best practice, you want to pay attention to how you name something. Typically, at least for me, I like to put numbers in, because if you set up object versioning, this makes incrementing easier as well, at least from a visual perspective. And this could be used for your lifecycle management too. Once again, you don't need to personally do that, but for me, I just feel it's easier. When it comes to the default storage class: multi-regional is generally for production data that you really don't want to lose access to, where you want that higher availability and geo-redundancy. Then there's regional. If you're going to select this, you could set it up in, let's say, Iowa or South Carolina, and it is replicated between the zones within that region. In terms of the cost, this is good for data that is regional, just like it infers, right?
If your customer base is in the US or Canada, then it probably makes sense to go regional. However, if your customer base is more international, then you probably want to choose multiregional.
Once again, this is where you want to plan your storage classes pretty cautiously because, as you're likely aware, the cost for multi-regional is somewhat higher than regional. So you're likely going to select regional or multi-regional for your production data. The difference between the two really depends on the use case: is it more national or international? Is it distributed or not? Then we have what's called Nearline, and we have Coldline. For those who are familiar with Amazon, this is essentially Google's answer to Glacier. Nearline is good for data that you're not going to access more than, let's say, once every 30 days; that's usually the rule of thumb, because of the cost to get access to the data if you need it. Google calls that retrieval of the data, and there is a cost to it. Same thing with Coldline. This one says access less than once per year. Personally, from what I've seen, you could certainly use Coldline for, let's say, something six or seven months old in a lot of cases. Generally, I like to think of Coldline as basically your tape archive. You don't want to have to restore from it if you don't need to. It's not exactly that, of course, but it sort of acts like that; just be aware. Now you can see that when you select the different options, it says location.
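To tie the classes together: here is a comment-only sketch of how you would change an existing object's storage class from the CLI with `gsutil rewrite` (the `-s` flag sets the new class; bucket and object names are hypothetical, and the command needs an authenticated Cloud SDK):

```shell
# Storage classes discussed in this module (rule-of-thumb access patterns
# as described in the video; check Google's pricing pages for current terms):
#   multi_regional : hot, geo-redundant production data
#   regional       : hot production data in a single region
#   nearline       : accessed less than about once per 30 days
#   coldline       : accessed less than about once per year (tape-archive style)

# Downgrade an existing object to Nearline:
# gsutil rewrite -s nearline gs://my-example-bucket/old-backup.tar.gz
```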
Now, pay attention to this as well. If most of your users are in Northern Virginia, then you probably don't want to select Europe, and you probably don't want to select the west, for example. As you can see, too, not all regions are covered; just be aware of that. If I select Nearline, this is where you could select the region. If I select multi-regional, you can see that it's going to the United States. If I select regional, I'm going to select us-east1, let's say, which is South Carolina. Or do I want Northern Virginia? Let's just go ahead and go over here. Let's specify labels. Labels are really important to identify ahead of time, if you can, because they could make your processes, your management, and your monitoring more efficient. Just go ahead and type one in. Let me think of something; I guess I could just say "test-app-bucket," and then I could assign a value. Now, one of the things about selecting a value is that you could select pretty much anything you like. You could use a number, you could add a word, whatever you want to do. You can see that it added this label to the bucket with this value, and you can keep on adding key-value pairs as you so choose. Let's go ahead and create; we need to get rid of that first, and then I hit Create. You can see that it now brings me over to that bucket I just created.
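Labels can also be applied outside the Console. A sketch using a hypothetical bucket and made-up keys; the JSON file is built and validated locally, while the `gsutil label` calls stay commented because they need an authenticated Cloud SDK:

```shell
# Build a labels file locally (keys and values are illustrative only).
cat > labels.json <<'EOF'
{"environment": "test", "app": "bucket-demo"}
EOF

# Sanity-check it locally.
python3 -m json.tool labels.json > /dev/null && echo "labels.json ok"

# Apply all labels at once, or change a single label in place:
# gsutil label set labels.json gs://my-example-bucket
# gsutil label ch -l team:storage-course gs://my-example-bucket
```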
Now let's proceed and select Upload files. I'm going to go ahead and select a few pictures and hit Open. As you can see, it's uploading the files, and now it has completed. Let's go ahead and minimise that. You can see that I have JPEGs there. When I select these files, you can see that I have the option enabled to share them publicly, or I could delete them as well. Now, one of the cautions I have to make sure you're aware of: if you select, for example, "share publicly," this means that this link is open to the public, as you would expect.
This could be a security risk, depending on what you're sharing. If you're just sharing files or pictures that aren't important, that's fine, but certainly be very cautious. If you want to make things easier for yourself, don't share publicly unless you need to. There are definitely use cases where you could share files and folders publicly; whatever you so choose, just be cautious. That's really just my thought here. Now, when you share something publicly, what you're actually doing is creating a link, and you can see that that link is going to be tied to storage.googleapis.com, and so on and so forth. It gives you a reference to that photo. And if I unselect it, it will no longer be there. I could also go over here and edit permissions. If I select Edit permissions, you can see that I can go and select the entity. As you're likely aware, the project is essentially the top level, and then you could also select the domain, or the user, or the group, and so on. So select your permissions judiciously; you can also assign reader or owner. Again, there is a lot of flexibility in what you can do here. Let's proceed back to buckets. If I go over to Buckets, you can see that it has my three buckets there. Now what I'd like to do is go back and create a folder. There are a lot of good reasons to create a folder. One of those is, of course, organization; another could be monitoring or sharing. Once again, be aware of how you organise your data and of the ins and outs of each of these capabilities as well. I want to make sure that you have a good entry-level to mid-level overview of each of these.
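The public link pattern mentioned above is predictable, which is handy when scripting. A sketch with hypothetical names; the `gsutil acl` commands that actually toggle public access are commented out since they need an authenticated Cloud SDK:

```shell
# Public objects are served from storage.googleapis.com using
# the bucket and object names:
bucket="my-example-bucket"
object="photo.jpg"
url="https://storage.googleapis.com/$bucket/$object"
echo "$url"

# Make the object public-readable, then revoke it again:
# gsutil acl ch -u AllUsers:R "gs://$bucket/$object"
# gsutil acl ch -d AllUsers "gs://$bucket/$object"
```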
When we go to create a folder, again, it has to have a unique name. Let's go ahead and type in Test Folder 1, and it says that you're creating this in this specific bucket. Let's hit Create. Now you can see that it says type: folder. If I go into the folder, let's say I have specific files that I want to add; I just hit Open, and there you go, I've just added those as well. You can see them here; I could share them publicly again. I can also edit permissions and the metadata as well. I could also move. What's actually nice about this is, let's say I have specific objects that I'd like to move from one bucket to another. I could go ahead and select the source, which I just did. So if I go to Move, it allows me to take the source that I just selected and move that specific file in that folder to, let's say, another folder. If I go back to the parent folder, I could then select this one here, and it will tell you, because this is actually a different storage class (this is probably multi-regional), it won't let you do it. So one of the things you want to be aware of is that when you're moving these files, the storage class matters; the options that you set matter. But you can still move this; you're just going to have to use gsutil. So if you copy this path, you could go ahead and use gsutil to move that file.
When we do the demo on this capability, you'll see more about gsutil, and you could also select that destination. In this case, let's just go back to that bucket and select that. Now, instead of just leaving it in that folder in that bucket, I'm going to drop it one level above. Let's go ahead and move that. You can see that that file is now gone from that folder; there were three, now there are two. If I proceed back, you can now see that I have that JPEG, which I just moved. Looks like it's actually here; I moved it at 1:40 p.m., just a second ago, and the storage class is regional.
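The move that the Console refused can be done with gsutil, which copies and then deletes under the hood. A comment-only sketch with hypothetical bucket, folder, and file names (requires an authenticated Cloud SDK):

```shell
# Move the object out of the folder, up one level in the same bucket:
# gsutil mv gs://my-example-bucket/test-folder-1/photo.jpg gs://my-example-bucket/
# Or move it into a bucket with a different storage class:
# gsutil mv gs://my-example-bucket/test-folder-1/photo.jpg gs://my-other-bucket/
# A trailing slash on the destination keeps the original object name.
```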
Now, if I select the file, you can see that that's the container picture, and so on. If I go back here, I could also edit the metadata if I want; I could take that image and add whatever metadata I so choose. And let's say I wanted to copy it: I could also copy it, I don't have to just move it. Then if I go here to browse, let's go back to the test folder. You can see that it's not there. Well, I wonder why. One of the things you want to be aware of is that there could be some latency when you do some of these operations. So if I go back and refresh, this may take a second. It could take ten minutes, it could take a minute; you just never know, a lot of it just depends on the latency. We'll go back and check that in a second. Now it looks like, yeah, we finally have a copy there. As you can see, it says "copy of." So now let's just go ahead and delete that. Actually, we need to go up here. Good.
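Editing metadata and copying can also be scripted. A comment-only sketch with hypothetical names (requires an authenticated Cloud SDK):

```shell
# Set a fixed-key header and a custom metadata key on an object:
# gsutil setmeta -h "Content-Type:image/jpeg" \
#                -h "x-goog-meta-source:storage-demo" \
#                gs://my-example-bucket/photo.jpg
# Copy rather than move (the original stays put):
# gsutil cp gs://my-example-bucket/photo.jpg gs://my-example-bucket/test-folder-1/
```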
Delete it; get it out of the way. Okay, that's essentially a very simple process to create a bucket, create a folder, and move files around. In the next demos that I have coming up, we're going to talk more about Transfer and some of the capabilities that you could use to migrate over from AWS, or from your own data centre, assuming you meet the requirements to move files over from your infrastructure to GCP. So to close it out, let's go back to Storage, and you could see that I have my buckets. Let's go ahead and do a quick clean-up here, and let me just make sure I'm where I'm supposed to be. Yeah, right there. Before I delete this, I just want to check a few things. You can see that this lifecycle is enabled. Now, before I delete anything, let's just pay attention to what we delete, because one of the things that's really easy to do, if you don't organise your files properly, is to accidentally delete your files. Google does caution you, of course, before you delete something; they remind you, which is great. So in this case, let's go over to this bucket here. And it looks like we have a little latency going on here. And we did.
And now I'm in that bucket. What I'd like to do now is go ahead and delete that folder. It asks, "Do you want to delete it?" There are two files in there, and it says that once you do it, you can't undo it. And yeah, I'm happy to go ahead and get it done. Now let's select the remaining files. As you can see, I could select a folder or whatever number of files I so choose, and go ahead and delete those as well.
And now I'm back to where I started. Let's go back to the buckets. As you can see, this is what we had beforehand, with the exception of this one. So guess what we want to do? We want to go ahead and delete that bucket. Now, before you delete any of your buckets, if you have applications using them, just be aware that it does let you delete them anyway. So if I go here to Delete, it tells you that you cannot undo this. Be very cautious, because if you have applications using this bucket, dropping files into it, and you delete the bucket, guess what's going to happen when they try to drop off or retrieve files from the bucket? It's not available. This is where you can create some significant, and avoidable, issues when you delete those files. So that's about all I had with buckets. Let's go ahead and continue on.
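The clean-up at the end of the demo maps to a few gsutil commands. A comment-only sketch with hypothetical names; note that `gsutil rb` refuses to remove a non-empty bucket, which matches the caution above (requires an authenticated Cloud SDK):

```shell
# Delete a "folder" (prefix) and everything under it:
# gsutil rm -r gs://my-example-bucket/test-folder-1
# Delete the remaining objects in the bucket:
# gsutil rm gs://my-example-bucket/**
# Remove the now-empty bucket:
# gsutil rb gs://my-example-bucket
```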