Let's use the Google Cloud PHP client library, which makes it easy to work with various GCP services.
Hello.
I'm Mandai, in charge of Wild on the development team.
Like AWS, GCP also has client libraries that support various programming languages.
It is convenient because you can access different GCP services in the same way.
Also, since the library is unified, usage doesn't change from one service to another, so once you learn it you will never get lost. It is a tool that benefits you more the more you use it.
This time, I will try using the PHP client library.
Install with Composer
It can be installed with Composer, so setup is very easy.
GCP's client library is modularized for each service, so just install the library for the service you need.
The example below installs a library for Cloud Storage.
composer require google/cloud-storage
The installation is now complete.
After that, you create a service account (described later) and use its credentials to access the service, as in the minimal sketch below.
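As a quick preview, here is a minimal sketch of what that looks like, assuming google/cloud-storage is installed and the service account key created in the next section is saved as key.json next to the script (the key path and project ID are placeholders):

```php
<?php

require __DIR__ . '/vendor/autoload.php';

use Google\Cloud\Storage\StorageClient;

// Authenticate with the downloaded service account key (placeholder path).
$storage = new StorageClient([
    'keyFilePath' => __DIR__ . '/key.json',
    'projectId'   => 'your-project-id',
]);

// List the buckets the service account can see, just to confirm it works.
foreach ($storage->buckets() as $bucket) {
    echo $bucket->name() . PHP_EOL;
}
```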
There are libraries for other services as well, so I have summarized them in the table below.
Service name | Module name | Remarks |
---|---|---|
Cloud Storage | google/cloud-storage | |
Cloud Datastore | google/cloud-datastore | |
Cloud BigQuery | google/cloud-bigquery | |
Cloud Spanner | google/cloud-spanner | Beta version |
Cloud Vision | google/cloud-vision | |
Cloud Translate | google/cloud-translate | |
Cloud Speech | google/cloud-speech | |
Cloud Natural Language | google/cloud-language | Some features are in beta version |
Google App Engine | google/cloud-tools | Works with Docker images for the flexible environment
Cloud Pub/Sub | google/cloud-pubsub | |
Stackdriver Trace | google/cloud-trace | |
Stackdriver Logging | google/cloud-logging | |
Stackdriver Monitoring | google/cloud-monitoring | |
Stackdriver Error Reporting | google/cloud-error-reporting | |
Video Intelligence | google/cloud-videointelligence | Beta version
Cloud Firestore | google/cloud-firestore | Beta version |
Cloud Data Loss Prevention | google/cloud-dlp | Data Loss Prevention API Early Access Program participants only |
BigQuery Data Transfer Service | google/cloud-bigquerydatatransfer | Private free trial version
- All can be installed and used from composer.
- Data is as of January 18, 2018
Create a service account
Create a service account from the GCP cloud console.
From the menu, navigate to the Credentials screen.
Next, choose "Service account key" from the "Create credentials" button.
If you want to create a new service account, select "New service account".
You can use any service account name you like that is easy to manage.
The available roles differ by service, so there is no single answer, but "Owner/Administrator" is the most privileged role and can perform every operation within the service. If in doubt, choose this one.
For accounts issued to developers, set a role with read and write privileges.
By the way, a single account can hold multiple roles.
For example, Datastore has both a user role and an index administrator role.
The service account ID is formatted like an email address, and the part before the @ can be any string you like.
The P12 key type is awkward to use from PHP, so select "JSON" instead.
After entering the above information, press the Create button.
Once created, a JSON file containing authentication information will begin downloading.
This JSON file is very important, so handle it with care.
The creation of the service account is now complete.
Sample program
I remember struggling with the sample programs in the GCP documentation because they left out the part that reads the credentials, so below is a sample program that, assuming Cloud Storage, goes from setting up the credentials all the way to actually uploading a file.
```php
<?php

require './vendor/autoload.php';

use Google\Cloud\Core\ServiceBuilder;

$keyFilePath = '../hogehoge.json';
$projectId   = 'sample-123456';
$bucketName  = 'new-my-bucket';
$uploadFile  = './test.txt';

$gcloud = new ServiceBuilder([
    'keyFilePath' => $keyFilePath,
    'projectId'   => $projectId,
]);

$storage = $gcloud->storage();
$bucket  = $storage->createBucket($bucketName);

// When using an existing bucket
// $bucket = $storage->bucket($bucketName);

$bucket->upload(fopen($uploadFile, 'r'));
```
The above sample uploads a file called test.txt to the new-my-bucket bucket.
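To confirm that the upload succeeded, you can list the objects in the bucket from the same authenticated instance. A minimal sketch continuing from the sample above (reusing its $bucket and $uploadFile variables; the object name in the comment is just an example):

```php
// List the objects now stored in the bucket to confirm the upload.
foreach ($bucket->objects() as $object) {
    echo $object->name() . PHP_EOL;
}

// You can also name the uploaded object explicitly instead of relying on
// the local file name, for example:
// $bucket->upload(fopen($uploadFile, 'r'), ['name' => 'uploads/test.txt']);
```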
Here and there you see samples that don't use the ServiceBuilder class, but personally I think the quickest way is to go through a ServiceBuilder instance, because you can then pull out an already-authenticated instance for each service, as in the sketch below.
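For example, one authenticated ServiceBuilder instance can hand out clients for several services. A sketch assuming the corresponding modules (here google/cloud-storage and google/cloud-datastore) are both installed, reusing $keyFilePath and $projectId from the sample above:

```php
// One ServiceBuilder holds the credentials...
$gcloud = new ServiceBuilder([
    'keyFilePath' => $keyFilePath,
    'projectId'   => $projectId,
]);

// ...and hands out already-authenticated clients per service.
$storage   = $gcloud->storage();   // needs google/cloud-storage
$datastore = $gcloud->datastore(); // needs google/cloud-datastore
```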
The GCP authentication documentation says to set the GOOGLE_APPLICATION_CREDENTIALS environment variable and run a command, but I feel that keeping everything in PHP requires less preparation, so the method introduced above is simply my personal favorite.
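For reference, the environment-variable approach looks roughly like this. A sketch assuming the same placeholder key path as before (you could also export the variable in the shell instead of calling putenv()):

```php
<?php

require __DIR__ . '/vendor/autoload.php';

use Google\Cloud\Storage\StorageClient;

// Point the standard environment variable at the key file (placeholder path)...
putenv('GOOGLE_APPLICATION_CREDENTIALS=' . __DIR__ . '/key.json');

// ...and the client picks up the credentials automatically.
$storage = new StorageClient([
    'projectId' => 'sample-123456',
]);
```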
Summary
"There are only benefits!"
That is what the GCP client libraries make you want to say; what do you think?
I thought it would be nice if there was a client library for Cloud SQL, but it seems like there isn't one.
There isn't anything for GCE, but I don't think there's any benefit to using PHP for VM management in the first place.
I think the point is to simply use the gcloud command.
There are even clients for services that are not yet generally available, such as Cloud Data Loss Prevention and BigQuery Data Transfer Service; the speed of library development is incredible.
In particular, among GCP's storage services, Cloud Storage is by far the cheapest, even in its top-end multi-regional class (about 65% of the price of standard persistent disk, which appears to be the next cheapest), so for a system that is planned to run on GCP, I would like to use it actively from programs instead of saving files locally.
That's it.