Stuff I'm Up To

Technical Ramblings

PHPUnit Testing Failure — August 14, 2019

Today’s challenge caused me to burn a lot of time before resolving the issue, which turned out to be an obvious mistake.

When I run a unit test I need to add a number of test rows to the database using the Faker support built into Laravel, so I added the seeding to my setUp() function.

When the test is finished I need to tidy up and remove the rows I added, which I do in my tearDown() function.

The first thing done inside an overridden function is to call the parent function, so we call parent::setUp() and parent::tearDown() respectively. Almost all my tests are like this. But when I reached one of them it failed and I could not figure out why.
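As a sketch, the pattern looks something like this (the model and factory details here are hypothetical – the point is the ordering of the parent calls):

```php
use Tests\TestCase;

class CostCentreOwnersTest extends TestCase
{
    private $owners;

    protected function setUp()
    {
        parent::setUp(); // parent first: boots the Laravel application

        // Seed test rows using Laravel's built-in factory/Faker support
        // (the CostCentreOwner model name is an assumption).
        $this->owners = factory(CostCentreOwner::class, 5)->create();
    }

    protected function tearDown()
    {
        // Remove the seeded rows while the application still exists.
        $this->owners->each->delete();

        parent::tearDown(); // parent last: tears the application down
    }
}
```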

PHPUnit 6.5.14 by Sebastian Bergmann and contributors.

 E                                       1 / 1 (100%)

 Time: 321 ms, Memory: 30.00MB

 There was 1 error:

 1) Tests\Unit\Finance\CostCentreOwnersTest::testApiGetCostCodes
  ReflectionException: Class config does not exist

I refactored a lot of code trying to find this issue. The light bulb moment was seeing only one of my deletions happening in the tearDown(). I then spotted the CostCentreOwnersObserver in the errors.

Of course! I’m using an Eloquent event trigger within the Observer and that is now failing. It’s the only difference from the other tests – they don’t have events.

I realised my simple mistake: I was calling parent::tearDown() before deleting my test rows. So the actual failure came from destroying the environment before I had finished with it.

Because the other test models don’t have Eloquent events, having parent::tearDown() in the wrong order didn’t affect them.


The moral of this story: make sure the call to parent::tearDown() is the last thing in your test’s tearDown() function.

Guzzle and Curl — August 12, 2019

Related to my previous post about Laravel, Guzzle and Nginx, I ran into an issue with our proxy. The proxy is always a source of fun and games.

Because the proxy breaks open SSL traffic to scan the content, the clients are required to have the proxy’s CA certificate installed, which tells them to trust the certificates our proxy server presents. In Windows and Linux you can insert the CA cert into the OS certificate store using group policy or by writing it into the store directly.

Curl uses its own certificate store, so we needed to copy the proxy CA cert into the curl store.

On Windows there wasn’t a certificate store, so I created one in a location that would remain even if anything was updated or moved.

Download the cacert.pem file and place it in c:\certs. Then I just appended my proxy cert, in PEM format, to the end of it.

C:> type proxy.pem >> c:\certs\cacert.pem

Edit your php.ini and change the curl setting to point at the new cacert.pem file:

 curl.cainfo = c:/certs/cacert.pem

You can find what php.ini you are using with:

C:> php --ini
Configuration File (php.ini) Path: C:\windows
 Loaded Configuration File:         C:\tools\php73\php.ini
 Scan for additional .ini files in: (none)
 Additional .ini files parsed:      (none)

Restart any php service, like Apache, Nginx, Artisan, etc. and curl should then trust the proxy server.
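A quick way to confirm which CA bundle PHP will hand to curl is to read the setting back with plain ini_get, which works in any PHP install:

```php
<?php
// Print the CA bundle path curl will use, if one is configured in php.ini.
$cainfo = ini_get('curl.cainfo');

if ($cainfo !== false && $cainfo !== '') {
    echo "curl.cainfo = {$cainfo}\n";
} else {
    echo "curl.cainfo is not set; curl falls back to its built-in defaults\n";
}
```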

Chocolatey Proxy —

I was tidying up another PC today and came across an annoying issue that I couldn’t resolve. It took me a while, uninstalling and reinstalling choco etc., and I still wasn’t getting to the bottom of it.

When I ran choco from the PowerShell command line I was asked for my proxy credentials and could then use the CLI. But every time I started Chocolatey GUI I’d get an error:

System.InvalidOperationException: Cannot read keys when either application does not have a console or when console input has been redirected from a file.

That was the light bulb moment: the GUI was waiting for input of my user name and password to get through the proxy.

The solution was to use the CLI to set the proxy and credentials.

choco config set proxy <proxy-url>
choco config set proxyUser <user>  # optional
choco config set proxyPassword <password>  # optional

Then the GUI fires up and I can update and install apps.

References: — August 8, 2019
Laravel, Guzzle and Nginx — August 7, 2019

After deploying a working test into our pre-production environment, the Guzzle API calls we were making to fetch bank holiday data from the site started failing.

Error creating resource: [message] fopen( failed to open stream: Connection timed out
 [file] /var/www/itsmpreprod/vendor/guzzlehttp/guzzle/src/Handler/StreamHandler.php
 [line] 323 {"exception":"[object] (GuzzleHttp\Exception\RequestException(code: 0): Error creating resource: [message] fopen( failed to open stream: Connection timed out

I spent quite some time trying to figure out what the issue was. Nginx would return a 504 Gateway Time-out message.

It transpires that Guzzle is pretty smart about what it does. It’s capable of using various transports to retrieve the data. In a development environment, under the artisan server, it was happy using TCP sockets, but once on the server under Nginx it tries to use curl.
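You can also pin the handler explicitly so development and production behave the same way. A minimal sketch, assuming the Guzzle HandlerStack API – here forcing the PHP stream handler rather than letting Guzzle auto-detect curl:

```php
use GuzzleHttp\Client;
use GuzzleHttp\HandlerStack;
use GuzzleHttp\Handler\StreamHandler;

// By default Guzzle picks curl when ext-curl is loaded, otherwise PHP streams.
// Pinning the handler removes that environment-dependent behaviour.
$handler = HandlerStack::create(new StreamHandler());

$client = new Client(['handler' => $handler]);
```

The same approach works with GuzzleHttp\Handler\CurlHandler if you want to force curl instead.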

I got my clue from here:

It looks like the issue was just because we hadn’t got the php module php7.0-curl installed!

$ sudo apt install php7.0-curl

Once installed we had to change the proxy scheme from tcp to http and the calls then worked as expected.

    public function getApiData()
    {
        $client = new Client();

        $res = $client->request('GET', '', [
            'proxy' => 'http://proxy:port',
        ]);

        return $res->getBody()->getContents();
    }