Stuff I'm Up To

Technical Ramblings

Laravel, Nuxt.js and Nginx — October 2, 2019

Laravel, Nuxt.js and Nginx

Whilst experimenting with Nuxt.js (a Vue.js framework) as a front-end client for Laravel, I discovered I was going to face some issues with CORS, certificates for HTTPS, and the whole business of serving the client over port 3000 and the API over port 80.

In the development environment this isn’t so bad, as I can run both the Laravel artisan web server and serve Nuxt.js and have them talk to each other – within reason. The problems started when I wanted to use Social Sign In with Facebook, Google etc. The callback from the OAuth process would fire, but the client would then fail with CORS errors, because I had to redirect the client to the API from the OAuth callback.

To resolve this issue I tried adding in a CORS module for Laravel and setting the values appropriately, but it still failed.

So I began thinking about what this would look like in production. I wouldn’t want to serve the API and client separately, and I’d probably put them both behind a reverse proxy, so let’s look at using Nginx.
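As a rough sketch of that production shape, a single Nginx server block could proxy the root to the Nuxt.js server and /api to Laravel, so both share one origin and the CORS problem disappears. The hostname, ports and paths here are my assumptions, not a final config:

```nginx
server {
    listen 80;
    server_name example.test;   # placeholder hostname

    # Everything else goes to the Nuxt.js server (assumed on port 3000)
    location / {
        proxy_pass http://127.0.0.1:3000;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }

    # API requests go to Laravel (assumed on port 8000 under artisan serve;
    # in production this would more likely be a fastcgi_pass to PHP-FPM)
    location /api {
        proxy_pass http://127.0.0.1:8000;
        proxy_set_header Host $host;
    }
}
```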

Continue reading
Laravel and Firebase Sync — September 30, 2019

Laravel and Firebase Sync

Today I’ve been working on an idea I’ve had for a while now. Sometimes you want your internal data exposed to the outside world, but don’t really want to open any firewall rules or create a reverse proxy etc.

Typically this could be used by a mobile application or forms product to access data that is created internally, but that external users need for form lookups or choices driven by your live system.

For this I looked at synchronising a table with Google Firebase cloud service.
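The post is light on detail here, but for a concrete flavour of the idea, here is a minimal sketch using the kreait/firebase-php SDK to mirror a table into the Realtime Database. The package choice, model name and node path are all my assumptions:

```php
// composer require kreait/firebase-php  (assumed SDK choice)
use Kreait\Firebase\Factory;

$factory  = (new Factory)->withServiceAccount('/path/to/service-account.json');
$database = $factory->createDatabase();

// Mirror each row of a local lookup table into a Firebase node
foreach (\App\Models\Lookup::all() as $row) {
    $database->getReference('lookups/' . $row->id)
             ->set($row->toArray());
}
```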

Continue reading
Developing in Windows — September 25, 2019

Developing in Windows

Surely not! Whoever would want to develop software using Windows?

Well, over the past week or so I’ve been taking a look at how things would work if I were to develop using Windows as the OS.

There are a few challenges. One of them relates to CRLF vs LF, but a few other issues add complexity too.

For instance, using Nginx and a Redis server on Windows isn’t as simple as grabbing them from the apt repository and having them installed and started as a service. Both are a little clunky to set up as Windows services: not impossible, but certainly not point and click.

Then what about using different versions of PHP depending upon the project you are working on? Pretty straightforward on Linux, but frustrating on Windows.

That was until I came across Laragon.

Laragon bundles a load of services and programs into a convenient wrapper so you can easily chop and change your development platform to suit your project.

Laragon includes services for web servers, both Apache and Nginx, including SSL/HTTPS support. It includes a Redis server, the ability to swap PHP versions, run Node.js, and provision databases using MySQL.

TailBlazer —
Git Credentials — September 18, 2019

Git Credentials

Using git to push commits up to the remote is all in a day’s work. Things change when you switch to a new remote and use a new account.

My first action was to change the remote for my local project. This is easy enough using git remote set-url origin [url]. It was only when I went to push this project up to the new remote repository that I found I was being denied with a 403 error, which means permission denied.

The big reason for the problem was a change from ssh to https. Using ssh was pretty straightforward: as long as you have your key and it is registered in your ssh config for the host you’re pushing to, the credentials are pretty robust.

I’d taken the step toward running the remotes on https due to firewall and proxy issues that meant https would be easier.

But because ssh keys can make life easier by not having a key passphrase (cool, unless your user password is weak), the change to https means you need to provide credentials on each push.

This is where you need to start looking at Git Credentials Storage.

Under Linux you can specify a credentials file that will feed your details into the process. The file should be placed somewhere very secure and with the correct permissions to ensure it isn’t misused, for instance as a hidden file under your home directory with only you having permission to access it.

eg.

$ touch ~/.my-credentials
$ chmod 600 ~/.my-credentials
$ git config credential.helper 'store --file ~/.my-credentials'
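Once the helper is set, the first successful authentication writes the details into that file in plain text (hence the chmod 600), one URL per line. The account below is, of course, made up:

```
https://someuser:s3cret-or-token@gitlab.example.com
```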

But with Windows things actually get a bit easier! Which is hard for a Linux head to accept :)

The git helper for Windows means that your credentials get stored with your Windows account.

$ git config credential.helper manager

Because I changed remotes and changed the account I was using, under Windows I needed to remove my old credentials. This is easy enough: I just brought up the start menu and typed “credentials”. Then I chose the option for “Manage Windows Credentials”. In the list of generic credentials I could see my old account and simply removed it. The next time I pushed I was asked for new credentials, which then got added into the list for me.

VSCode CRLF vs LF Battle — September 4, 2019

VSCode CRLF vs LF Battle

I’m a Linux guy. I like my line feeds a simple LF. But when developing cross-platform you hit Windows and face CRLF, and it can be a real linting challenge.

Git tries to be helpful in that it translates LF to CRLF when you pull onto a Windows platform. But that doesn’t help at all when your project’s .eslintrc.js is set for unix-style line endings.

      "linebreak-style": [
        "error",
        "unix"
      ], 

Changing CRLF to LF in VSCode is easy enough, but having to do it on every file you open is madness.
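One way I know of to win the battle at the source, rather than in the editor, is to pin line endings in the repository itself with a .gitattributes file. A minimal sketch:

```
# Normalise everything Git detects as text, and check it out with LF
* text=auto eol=lf
```

Setting core.autocrlf to input on the Windows machine (git config --global core.autocrlf input), or "files.eol": "\n" in VSCode’s settings, achieves a similar effect per-machine rather than per-repo.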

Continue reading
Laravel and O365 Authentication —

Laravel and O365 Authentication

Our app currently uses LDAP authentication, but as our environment is rapidly moving onto the cloud and Microsoft Office 365, it’s time to investigate authentication using O365 – more specifically, Azure Active Directory.

Getting this going was actually more straightforward than I expected. Laravel already has an OAuth2 authentication provider called Socialite. Once installed, I needed to add in the ‘microsoft-graph’ driver.

The key piece of the puzzle is here: https://socialiteproviders.netlify.com/providers/microsoft-graph.html
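For context, wiring in a SocialiteProviders driver generally comes down to registering its event listener and adding a block to config/services.php. The exact keys for microsoft-graph are on the page above; the env variable names below are just placeholders:

```php
// config/services.php – placeholder env names, check the provider docs
'graph' => [
    'client_id'     => env('GRAPH_CLIENT_ID'),
    'client_secret' => env('GRAPH_CLIENT_SECRET'),
    'redirect'      => env('GRAPH_REDIRECT_URI'),
],
```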

Continue reading
PHPUnit Testing Failure — August 14, 2019

PHPUnit Testing Failure

Today’s challenge caused me to burn a lot of time before resolving the issue, which turned out to be an obvious mistake.

When I run a unit test I need to add a number of test rows to the database using the Faker that is built into Laravel. So I added this into my setUp() function.

When the test is finished I need to tidy up and remove the rows I added which I put into my tearDown() function.

The first thing done inside a function override is to call the parent function, so we call parent::setUp() and parent::tearDown() respectively. Almost all my tests are like this. But when I reached one of them it failed and I could not figure out why.

PHPUnit 6.5.14 by Sebastian Bergmann and contributors.

 E                                       1 / 1 (100%)

 Time: 321 ms, Memory: 30.00MB

 There was 1 error:

 1) Tests\Unit\Finance\CostCentreOwnersTest::testApiGetCostCodes
  ReflectionException: Class config does not exist
... 
  /…/app/Observers/CostCentreOwnersObserver.php:78
  /…/app/Observers/CostCentreOwnersObserver.php:69

I refactored a lot of code trying to find this issue. The light bulb moment was seeing only one of my deletions happening in the tearDown(). I then spotted the CostCentreOwnersObserver in the errors.

Of course! I’m using an Eloquent event trigger within the Observer and that is now failing. It’s the only difference to the other tests – they don’t have events.

I realised my simple mistake. I was calling the parent::tearDown() before I was deleting my test rows. So the actual failure was relating to me destroying the environment before I’ve finished with it.

Because the other test models don’t have Eloquent events the parent::tearDown() being in the wrong order didn’t affect them.

TL;DR

The moral of this story is: make sure your call to parent::tearDown() is the last thing in your test’s tearDown() function.
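As a sketch of the ordering that matters (the model name is borrowed from the failing test above; the rest is illustrative, not the actual code):

```php
protected function setUp()
{
    parent::setUp();   // build the environment first...

    // ...then create the test rows (Laravel 5.x factory helper)
    $this->rows = factory(CostCentreOwner::class, 5)->create();
}

protected function tearDown()
{
    // Delete the test rows while config, events and observers still exist,
    // so Eloquent event triggers can fire safely
    $this->rows->each->delete();

    parent::tearDown(); // destroy the environment last
}
```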

Guzzle and Curl — August 12, 2019

Guzzle and Curl

Related to my previous post about Laravel, Guzzle and Nginx, I ran into an issue with our proxy. The proxy is always a source of fun and games.

Because the proxy breaks open SSL traffic to scan the content, the clients are required to have the CA certificate installed that tells them to trust our proxy server certificate. In Windows and Linux you can insert the CA cert into the OS using group policy or by writing it into the certificate store.

Curl uses its own certificate store, so we needed to copy the proxy CA cert into the curl store.

On Windows there wasn’t a certificate store, so I created one in a location that would remain even if anything was updated or moved.

Download the cacert.pem file and place it in c:\certs. Then I just appended my proxy cert, in PEM format, to the end.

C:> type proxy.pem >> c:\certs\cacert.pem

Edit your php.ini and change the curl setting to point at the new cacert.pem file:

[curl]
 curl.cainfo = c:/certs/cacert.pem

You can find which php.ini you are using with:

C:> php --ini
Configuration File (php.ini) Path: C:\windows
 Loaded Configuration File:         C:\tools\php73\php.ini
 Scan for additional .ini files in: (none)
 Additional .ini files parsed:      (none)

Restart any PHP service (Apache, Nginx, Artisan, etc.) and curl should then trust the proxy server.

Chocolatey Proxy —

Chocolatey Proxy

I was tidying up another PC today and came across an annoying issue that I couldn’t resolve. It took me a while – reinstalling, uninstalling choco, etc. – and I was still not getting to the bottom of it.

When I ran choco from the PowerShell command line I got asked for my proxy credentials and could then use the CLI. But every time I started Chocolatey GUI I’d get an error:

System.InvalidOperationException: Cannot read keys when either application does not have a console or when console input has been redirected from a file.

I had a light bulb moment: this meant the GUI was waiting for input of my user name and password to get through the proxy.

The solution was to use the CLI to set the proxy and credentials.

choco config set proxy 
choco config set proxyUser  #optional
choco config set proxyPassword  # optional

Then the GUI fires up and I can update and install apps.

References: https://warlord0blog.wordpress.com/2019/02/27/chocolatey-package-manager/

OverAPI.com — August 8, 2019
Laravel, Guzzle and Nginx — August 7, 2019

Laravel, Guzzle and Nginx

After deploying a working test into our pre-production environment, the Guzzle API calls we were making to fetch bank holiday data from the .gov.uk site started failing.

Error creating resource: [message] fopen(https://www.gov.uk/bank-holidays.json): failed to open stream: Connection timed out
 [file] /var/www/itsmpreprod/vendor/guzzlehttp/guzzle/src/Handler/StreamHandler.php
 [line] 323 {"exception":"[object] (GuzzleHttp\Exception\RequestException(code: 0): Error creating resource: [message] fopen(https://www.gov.uk/bank-holidays.json): failed to open stream: Connection timed out

I spent quite some time trying to figure out what the issue was. Nginx would just return a 504 Gateway Timeout message.

It transpires that Guzzle is pretty smart in what it does. It’s capable of using various types of calls to retrieve the data. In the development environment, under the artisan server, it was happy using tcp sockets, but once on the server under Nginx it tries to use curl.

I got my clue from here:

https://github.com/guzzle/guzzle/issues/1841#issuecomment-341395019

It looks like the issue was simply that we hadn’t got the PHP module php7.0-curl installed!

$ sudo apt install php7.0-curl

Once installed we had to change the proxy scheme from tcp to http and the calls then worked as expected.

    // Client here is GuzzleHttp\Client
    public function getApiData()
    {
        $client = new Client();

        // Route the call through the proxy; note the http:// scheme
        // rather than tcp:// now that the curl handler is in play
        $res = $client->request(
            'GET', 'https://www.gov.uk/bank-holidays.json',
            [
                'proxy' => 'http://proxy:port'
            ]
        );

        return $res->getBody()->getContents();
    }