We’ve had a vsftpd server for a while and it’s performed very well for us, but it appears it’s no longer actively maintained. That may not be a problem – it still works just fine and we don’t have any obvious vulnerabilities with it – but as the OS it’s running on is Debian Wheezy we need to move up at least to Stretch. So I figured I’d try deploying a new server, this time configured with proftpd.
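For reference, this is roughly the sort of minimal baseline proftpd.conf I’d start from – a sketch only; the server name and paths here are illustrative assumptions, not our actual config:

```
# /etc/proftpd/proftpd.conf – minimal illustrative baseline
ServerName        "FTP Server"    # display name (assumed)
Port              21
Umask             022
MaxInstances      30
User              proftpd
Group             nogroup
DefaultRoot       ~               # chroot users into their home directory
RequireValidShell off             # allow accounts with nologin shells
```

From there it’s mostly a case of layering on TLS and any virtual-user setup to match what vsftpd was doing.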
We’ve been using Azure for a few months now, so it’s about time one of our certificates expired, right? Well, according to the email notification we’ve just received, a certificate needs updating or we’ll lose access!
In order to provide your organization with uninterrupted access to Office 365 and Microsoft Azure Active Directory (Azure AD), you need to ensure your certificate for the domain(s) domain.tld is renewed and updated in Azure AD right away.
Our current certificate on file for domain(s) domain.tld expires on 5/5/2018.
If you don’t take action, your users will lose access on this date or, in the default configuration of Active Directory Federation Services, 15 days prior to 5/5/2018.
What you should do right now
If you are using AD FS with the default configuration, or are using a third party STS or a non-default configuration of AD FS, follow the article here.
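For a default AD FS configuration like ours, the usual route is to let AD FS auto-roll the token-signing certificate and then push the update to Azure AD with the MSOnline PowerShell module – a hedged sketch, assuming the MSOnline module is installed and domain.tld is the federated domain from the email:

```powershell
# Assumes the MSOnline module is available (Install-Module MSOnline)
Import-Module MSOnline
Connect-MsolService            # sign in with a Global Admin account

# Check what Azure AD currently holds for the federated domain
Get-MsolFederationProperty -DomainName domain.tld

# Push the current AD FS token-signing certificate up to Azure AD
Update-MsolFederatedDomain -DomainName domain.tld
```

Run on the primary AD FS server (or point at it), then re-check the federation properties to confirm the thumbprints now match.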
I’ve kicked this around a few times and resigned myself to just using the non-DFS path to attach to. But we’ve recently changed some of the servers around and the paths have changed – obviously the DFS paths haven’t. So I thought I’d have a go at fixing the problem.
Ok, so I know Microsoft have been making some big steps in the world of Open Source. I confess to giving them little ear time, mainly because EVERYTHING we seem to do at work ends up a licensing battle that Microsoft always win – and by win I mean empty this year’s budget in one go.
I’ve been using Atom as my editor of choice for some time now and really like it, but someone suggested I check out Microsoft Visual Studio Code. I was aware of it. It has the same construction as Atom – it’s based on Node.js and built using Electron, just the same. So I installed it today… and wowsers! It’s impressive. It understands git and picks up file changes using git status, and I’d almost swear it was Atom.
It even has extensions for linting! Of course it also has a PowerShell extension – which to be honest I found missing in atom when coding PS in Linux.
But I’m not 100% sold yet. Atom is going to take some beating. I’m certainly more surprised by Code than I thought I would be, though.
Our GIS team use a PostgreSQL server with PostGIS. They recently asked if there was any way we could display some data in a simple web form for our users. So a bit of development work was required.
I didn’t want to code against their live system, so I thought I’d install a local version of PostGIS and copy the data from their database.
The database they wanted to access has 28 million rows – so it’s going to take a while.
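A hedged sketch of how that copy might go using the standard PostgreSQL tools – the host, database and table names here are placeholders I’ve made up, not their actual system:

```shell
# Dump just the table we need from the GIS server (names are illustrative)
pg_dump -h gis-server -U gisuser -d gisdb -t public.big_table -Fc -f big_table.dump

# Restore into a local PostGIS-enabled database
createdb gislocal
psql -d gislocal -c "CREATE EXTENSION IF NOT EXISTS postgis;"
pg_restore -d gislocal big_table.dump
```

The custom format (-Fc) dump keeps the transfer compressed, which matters with 28 million rows.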
This could have saved me some time today! I’m new to PowerShell, and it was hard work to discover that the problem wasn’t mine.
$enc = [System.Text.Encoding]::GetEncoding(28591) # iso-8859-1
# Using StringBuilder to form a valid XML file with encoding declaration
$builder = New-Object System.Text.StringBuilder
$stringWriter = New-Object System.IO.StringWriter($builder)
$settings = New-Object System.Xml.XmlWriterSettings
$settings.Encoding = $enc
$settings.Indent = $true
$settings.CloseOutput = $false
$settings.CheckCharacters = $true
Write-Host $settings.Encoding
$writer = [System.Xml.XmlWriter]::Create($builder, $settings)
Write-Host $writer.Settings.Encoding
$xmlDoc.AppendChild($xmlData)
$xmlDoc.Save($writer) # Save the XML into the $writer object
$writer.Close()
$stringWriter.Dispose()
I created a settings object to set the encoding I wanted and then passed that into XmlWriter::Create, but it gets totally ignored and reverts back to the default UnicodeEncoding. So the settings value is Latin1Encoding, but once it reaches the other side of XmlWriter::Create it has become UnicodeEncoding. The result is that every XML file I create has encoding="utf-16" in the declaration.
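The workaround I’ve since seen suggested is to avoid StringBuilder/StringWriter altogether: a .NET string is always UTF-16 internally, so the StringBuilder overload of XmlWriter::Create ignores the settings encoding. Writing to a stream instead lets the settings encoding through – a sketch, assuming $xmlDoc is the XmlDocument from above:

```powershell
$enc = [System.Text.Encoding]::GetEncoding(28591) # iso-8859-1
$settings = New-Object System.Xml.XmlWriterSettings
$settings.Encoding = $enc
$settings.Indent = $true

# Create the writer over a stream, not a StringBuilder –
# the declaration now follows $settings.Encoding
$stream = New-Object System.IO.MemoryStream
$writer = [System.Xml.XmlWriter]::Create($stream, $settings)
$xmlDoc.Save($writer)
$writer.Close()

# Decode with the same encoding to see the result as text
$enc.GetString($stream.ToArray())
```

Replace the MemoryStream with a FileStream (or just pass a file path to Create) to write straight to disk with the right declaration.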
It’s been a long while since I used VBScript – I guess I should get more familiar with PowerShell, but there’s a simple project that requires a script change: read a colon-delimited file and convert it to comma-separated.
The original script read the file as a text file one line at a time and split each line into an array using the colon delimiter. Nothing too wrong with this, but the file is fixed-length with loads of dead spaces to trim out, date conversions to be done and some special currency handling. So I thought I’d use an OLEDB method to read the data as a recordset.
Should be simple enough right?
' ADO constants aren't defined when late binding, so declare them
Const adOpenForwardOnly = 0
Const adLockReadOnly = 1
Const adCmdText = 1

Set cn = CreateObject("ADODB.Connection")
Set rs = CreateObject("ADODB.RecordSet")
cn.Open "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=c:\temp;" & _
    "Extended Properties=""text;HDR=No;FMT=Delimited(:);"""
rs.Open "select * from file.20180131;", cn, adOpenForwardOnly, adLockReadOnly, adCmdText
Well, not really. The biggest problem I encountered was that the file names in use are suffixed with the date, e.g. filename.20180131. That shouldn’t be a problem, but the OLEDB text handler fails with anything but .txt, .asc or .csv extensions. Worse still, the error messages it comes back with are terrible.
First off I got messages about the file being read only, which it isn’t.
Microsoft Access Database Engine: Cannot update. Database or object is read-only.
Then adding in a ReadOnly=False to the connection string only made things worse!
Microsoft Access Database Engine: Could not find installable ISAM.
All because the extension needs to be .txt!
Then, when I finally got it working by using the .txt extension, it read the entire line into a single field/column. It ignores the FMT=Delimited(:) in the connection string – it just doesn’t work like that anymore. You MUST create a schema.ini file in the same location as your text file and configure the options that way.
[file.txt]
ColNameHeader=false
CharacterSet=ANSI
Format=Delimited(:)
CurrencySymbol=#
Col1=A Long
Col2=B Long
Col3=C Text
Col4=D Currency
Col5=E Date
Our file includes a # instead of a GBP £ sign, so we can even fix that in the schema.ini file by telling it to see the # as a currency symbol.
So we fix the VBScript and now we can read the data without messing with split(), arrays and data conversion.
' ADO constants aren't defined when late binding, so declare them
Const adOpenForwardOnly = 0
Const adLockReadOnly = 1
Const adCmdText = 1

Set cn = CreateObject("ADODB.Connection")
Set rs = CreateObject("ADODB.RecordSet")
cn.Open "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=c:\temp;" & _
    "Extended Properties=""text;"""
rs.Open "select * from file.txt;", cn, adOpenForwardOnly, adLockReadOnly, adCmdText
OLEDB Drivers without installing Office
You don’t need Office on your server to read data using OLEDB. You can just install the Microsoft Access Database Engine 2016 Redistributable.
So today’s been the first day following the consultants’ departure. They configured our Exchange 2013 estate to act as a hybrid solution, allowing us to migrate our mailbox users onto Office 365.
The config and setup certainly seemed more straightforward on the cloud side than the “on premise” parts. We had plenty to do to set up internal and external autodiscover DNS records, reverse proxying and ActiveSync with Sophos Mobile Control.
But now the consultants have gone, we’re left picking up the pieces – it seems no job is left finished.
Up until this week we’ve been able to get away with a very simple SMC installation that proxies Exchange ActiveSync (EAS) from the one server with the base Sophos Mobile Control program without using a Standalone EAS Proxy.
But now we’re moving towards Office 365 in the cloud, Exchange ActiveSync gets messy. As we’re in a hybrid setup, where most users’ mailboxes are on an internal Exchange 2013 instance and only a few are on Office 365, the EAS Proxy part of SMC needs to know about more than one server/service to proxy to.
Whilst trying to add a new cluster for file shares to take over from the previous one we found that whilst replication worked to migrate the files, we could not remove or disable the old paths from the Folder Targets.
Access Denied – obviously some kind of permission issue, but try as we might, comparing ACLs between systems, we couldn’t see where the issue was.
It all came down to the power of my Google Fu.
This wound me up this week. Every time I tried to open an Explorer instance to view some files I’d have to wait what seemed like an eternity before the window opened. It must have been about 30 seconds, maybe longer.
Ultimately it turned out to be a problem of my own making – kind of.
I’d repeatedly visited a Samba/CIFS share on a virtual Linux box I’ve been working on. Windows decided to add the share to my “Quick Access” list. But because the virtual box isn’t always on, the share wasn’t accessible and so explorer would have to wait for it to time out before showing me my C: drive.
Just clear the “not so” Quick Access list and presto, Explorer is back to opening quickly again.
Press Windows Key (or open Start Menu), type “folder” and open the “File Explorer Options” that are listed. Then click the “Clear” button under Privacy to get things back to as they should be.
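If you prefer a command line, the same history can (as far as I know) be cleared by deleting Explorer’s jump-list files – a hedged sketch; this is the standard location for them, but take a copy first if unsure:

```powershell
# Quick Access / recent-items history lives in these jump-list files
Remove-Item "$env:APPDATA\Microsoft\Windows\Recent\AutomaticDestinations\*" -Force

# Restart Explorer to pick up the change (it relaunches itself on Windows 10)
Stop-Process -Name explorer -Force
```

This wipes all pinned and recent Quick Access entries, so it’s the blunt version of the Clear button above.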
I Googled plenty of advice recommending MSCONFIG, stopping services like Windows Search and Cortana, adding registry keys and other nonsense – when all it was was a stale Quick Access entry.
Time to replace the PaperCut web server certificate. So pleased I ran into Keystore Explorer previously as this made changing the web server certificate a breeze.
Put simply, you create a new keystore file in the Program Files\PaperCut MF\server\custom folder and import the certificate that you obtain from your internal CA. We did this using MMC and the Certificates snap-in on the print server, then exported the certificate with its private key to a .pfx file. Then just import the .pfx into the new keystore in Keystore Explorer.
Then edit the server.properties file in Program Files\PaperCut MF\server and add the relevant keystore and password details:
### SSL/HTTPS Configuration (Default: 9192) ###
server.ssl.port=9192

# Custom SSL keystore example (recommend placing in the custom directory)
server.ssl.keystore=custom/my-ssl-keystore
server.ssl.keystore-password=myPassword
server.ssl.key-password=myPassword
Restart the PaperCut services, give it a minute and the user and admin portal should now be using the new certificate.
Now every printer that has an embedded PaperCut app will need to be updated to accept the new certificate. This means visiting the PaperCut admin console on every device – yes, that’s the painful bit if you have a lot of printers. Log in to the console and click Apply, even though you’ve made no change; it will then ask you to accept and trust the new certificate.