Hashemian Blog
Web, Finance, Technology, Running

The Long, Hard and Possibly Foolish Path to SSL/TLS Security

by @ 10:57 pm
Filed under: internet,web

... or TLS 1.2 on Fedora Core 14/FC14 and other older Linux versions

With the chorus for secure browsing getting louder and more prevalent, HTTPS migration is becoming inevitable. Going secure is a pretty major undertaking, fraught with pitfalls. It starts with the source files that produce the html pages, and it can get ugly: if even one element in a page is loaded over http rather than https, there's no green padlock. Using protocol-relative URLs (//) instead of hard-coded http:// or https:// is quite helpful to that end. That's one side of the equation. The other is the server itself.
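
As a rough sketch of that source cleanup, a recursive search and replace can convert hard-coded references to protocol-relative ones. The directory and extension below are placeholders, and any bulk rewrite like this deserves a review before going live:

# find hard-coded http:// asset references in the page sources (path is hypothetical)
grep -rn 'src="http://' /var/www/html --include='*.html'

# back up each file (.bak) and rewrite src/href attributes to protocol-relative form
sed -r -i.bak 's#(src|href)="http://#\1="//#g' /var/www/html/*.html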

I run my site on an older server with an old Fedora Core OS (FC14) and, by extension, an older version of the web server software, Apache 2.2.17. Over the years I have updated a few components here and there and fixed and customized a bunch of others, especially after new vulnerabilities popped up. Upgrading the server to the latest and greatest version would be a non-trivial task for me. The old hardware may not be up to it, many of the OS customizations would be lost, migrating the data and config files would be a pain, and there would be downtime as well. Yet FC14 cannot support the newer and safer SSL/TLS technologies considered acceptable by today's browsers.

At my day job I have access to resources to overcome this problem by fronting the web server with other servers running newer technologies. For example, a combination of HAProxy and Varnish provides excellent web acceleration, load balancing, and SSL termination without any updates to the core web server. No such luck for a small-time operator like me with limited resources, so what to do?

One approach would be to upgrade only the parts of the OS and Apache (the httpd program) that deal with encryption, but there isn't much in the way of online resources on this topic other than the customary advice to upgrade the OS. In the end this became a long process of trial and error, but it was a successful endeavor with a good bit of learning as a bonus. Here's how I did it.

Apache 2.2.17 running on Fedora Core 14 can be configured for SSL, however it only supports up to TLS 1.0, with older cipher suites and the weak RSA key exchange. I had already patched OpenSSL after the Heartbleed bug became public, but what I needed were newer versions of the libcrypto.so.1.0.0 and libssl.so.1.0.0 libraries used by mod_ssl.so, the module httpd loads to enable SSL.

I downloaded the source and built OpenSSL version 1.0.1u. Building applications from source code on Linux is usually a three-step process: configure, make, and make install. After 'make install' the new OpenSSL libraries were placed in an alternate directory, /usr/local/ssl, instead of overwriting their main system counterparts.
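
For reference, the build went roughly like this; the download URL is illustrative, the 'shared' option is what produces the .so libraries, and /usr/local/ssl is OpenSSL's default install prefix:

wget https://www.openssl.org/source/openssl-1.0.1u.tar.gz
tar xzf openssl-1.0.1u.tar.gz
cd openssl-1.0.1u
./config shared      # 'shared' builds libcrypto.so.1.0.0 and libssl.so.1.0.0 alongside the static libs
make
make install         # installs under /usr/local/ssl, leaving the system OpenSSL untouched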

The next step was to point mod_ssl.so at the new libcrypto.so.1.0.0 and libssl.so.1.0.0 libraries, and my tool of choice for that was patchelf. I downloaded and built patchelf 0.9.

Before using patchelf I took one step which, in hindsight, I'm not sure was necessary. That step was adding the location of the new libraries to a conf file under /etc/ld.so.conf.d and running the ldconfig command to add the new location to the library cache (/etc/ld.so.cache) used by the dynamic linker.
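
That step amounts to something like this (the conf file name is arbitrary):

echo "/usr/local/ssl/lib" > /etc/ld.so.conf.d/openssl-local.conf   # register the new library directory
ldconfig                                                           # rebuild /etc/ld.so.cache
ldconfig -p | grep /usr/local/ssl                                  # verify the new libraries show up in the cache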

Here's an example of a command I used to replace one of the libraries in mod_ssl.so; I did the same for the other:

./patchelf --replace-needed libcrypto.so.1.0.0 /usr/local/ssl/lib/libcrypto.so.1.0.0 mod_ssl.so

Then I used the ldd command to make sure mod_ssl.so was now linking to these new libraries.
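
Something along these lines, assuming the stock Fedora layout for the module path:

# both libraries should now resolve to the new builds under /usr/local/ssl/lib
ldd /etc/httpd/modules/mod_ssl.so | grep -E 'libssl|libcrypto'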

Following an httpd restart and a check of one of the secure pages, I had the encouraging green padlock in the Chrome browser. Indeed the page was now using TLS 1.2 with a strong cipher. Yet the key exchange was still the obsolete RSA. The new OpenSSL libraries had certainly improved mod_ssl.so, but the strong key exchange element was still missing.

To overcome that issue I had to rebuild mod_ssl.so from a newer version of the Apache source code, in this case 2.2.32. The configure step was run with the following parameters:

./configure --enable-mods-shared="ssl"  --with-ssl=/usr/local/ssl

After make I found the new mod_ssl.so in one of the subdirectories of the source tree. I skipped the make install step to avoid the complications of the new version installing itself over the system one. Interestingly, the new mod_ssl.so was already linked against the new libcrypto.so.1.0.0 and libssl.so.1.0.0 libraries I had built in the previous step. I suppose adding that directory to the library cache had helped with that.
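
Locating and deploying the new module went roughly like this; the .libs path is where Apache 2.2's libtool build typically leaves shared modules, and the destination assumes the stock Fedora httpd layout:

make
find . -name mod_ssl.so                      # typically turns up modules/ssl/.libs/mod_ssl.so
cp /etc/httpd/modules/mod_ssl.so /etc/httpd/modules/mod_ssl.so.orig   # keep the old module as a fallback
cp modules/ssl/.libs/mod_ssl.so /etc/httpd/modules/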

I placed mod_ssl.so in the module folder loaded by /etc/httpd/conf.d/ssl.conf and restarted httpd. And it failed to start, with a message about a missing symbol, ap_map_http_request_error! Obviously mod_ssl.so was calling into a function that the older httpd (or one of its libraries) doesn't provide.

To fix that error I edited the file modules/ssl/ssl_engine_io.c and replaced the line:

return ap_map_http_request_error(rv, HTTP_INTERNAL_SERVER_ERROR);

with

return HTTP_INTERNAL_SERVER_ERROR;

I admit, this is a blind alteration with possible adverse repercussions, so I don't vouch for it. Running make once again yielded a new mod_ssl.so, and this time httpd started just fine, with the strong key exchange in place. Testing the site with SSL Labs gave additional confirmation that the SSL setup was indeed working fine.
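
For a quick command-line check, the newly built openssl binary can be pointed at the site (my domain is shown; any secure page would do):

# confirm the server now negotiates TLS 1.2 and a modern cipher
/usr/local/ssl/bin/openssl s_client -connect hashemian.com:443 -tls1_2 < /dev/null 2>/dev/null | grep -E 'Protocol|Cipher'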

If you're wondering, I use Let's Encrypt for free SSL certificates. The recommended utility for obtaining certificates is certbot, but that tool, with its overly complex and finicky Python virtual environment, wouldn't work under FC14. The tool that worked beautifully was getssl. It's simple and clean, yet powerful and flexible, written as a single executable bash script. Kudos to the getssl team for creating this robust tool.
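
The basic flow is roughly as follows, using this domain as the example; the generated config controls where the certificate lands and how the ACME challenge is served:

./getssl -c hashemian.com      # create a default config, then edit ~/.getssl/hashemian.com/getssl.cfg
./getssl hashemian.com         # obtain (or renew) the certificate; typically run periodically from cron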

So there you have it for enabling modern SSL/TLS in an older environment, in this case FC14. The prevailing wisdom is to abandon the old OS and start fresh on a newer platform. I don't disagree with that philosophy, and I set out to do just that when I started on this journey. My way was possibly more difficult and more prone to pitfalls, but in the end it was more satisfying and more instructive.

I haven't migrated the entire site to HTTPS yet, but you can click secure whoami to view and examine the first secure page.

 

HTTP to HTTPS Migration

by @ 10:26 pm
Filed under: google,internet,web

A universally secure internet may have its defenders and detractors, but like it or not, Google is going to force site encryption (https) across the board.

First it was the SEO penalty threat, supposedly giving higher scores to secure sites, but that doesn't seem to have worked out great. I think Google recognized that giving prominence to secure sites over plain ones might let low-quality sites steal rankings from reputable ones simply by going encrypted. That would have meant poor search results pages, possibly alienating users and driving them to competitors such as Bing.

Now Google is coming at this from another angle, the Chrome browser, and this one may stick. Since Chrome has the biggest browser market share, Google can shame non-encrypted sites right from the browser rather than jeopardizing its search engine money machine.

Beginning in January 2017, Chrome will print a timid 'Not secure' label next to a plain page's URL, indicating it is not encrypted. But that is just the start. The plan is to make the label bolder and more colorful in future versions of Chrome. I suspect that at some point Chrome may require users to click through warning messages to view a plain page, much like the cumbersome steps needed today to view a page served with a broken or invalid certificate.

The process of migrating from a plain site to an encrypted one starts with obtaining a site certificate. This used to be an expensive proposition, but nowadays a basic certificate can be had for free. In terms of the web server, there are three ways to migrate a site from plain to secure:

1- In-place migration of the web server application - Just about any web server on the market today can handle secure connections as well as plain ones. The process generally involves installing the certificate and making some configuration changes, and the site goes encrypted. Servers hosting multiple domains may, however, need an upgrade; for that, check for SNI support. For example, Microsoft's IIS below version 8 does not support SNI. And if you have users still on Windows XP with Internet Explorer, good luck: SNI isn't supported there at all.

2- Using an https appliance - Here the web server infrastructure is left intact but fronted by another server or service, often called an https appliance or SSL terminator. There are many such appliances on the market that are relatively easy to set up. There are also open source products such as Nginx or HAProxy that require a bit more tech know-how. In both cases they are deployed by installing the corresponding domain certificates and exposing the appliance to internet traffic. Internally it accesses the actual web server via plain http and returns pages to users encrypted over https (see the sketch after this list).

3- Using a CDN - This is similar to the 2nd method, except that the appliance is managed in the cloud by another company, like CloudFlare (free), Akamai, or CloudFront, among others. The benefit is that little administration is required and in some cases, like CloudFlare, even the certificate is handled for you. The downside is giving up a certain level of control and trust, which a business may not be comfortable with.
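
As a minimal sketch of the second approach, here is roughly what an HAProxy (1.5 or later) front end doing SSL termination looks like; the certificate path and backend address are placeholders, and a real config needs more (timeouts, logging, cipher policy):

# fragment of /etc/haproxy/haproxy.cfg
frontend https-in
    bind *:443 ssl crt /etc/haproxy/certs/example.com.pem   # certificate and key bundled in one PEM file
    default_backend origin

backend origin
    server web1 127.0.0.1:80    # the existing plain-http web server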

Going https is not a trivial task, especially for the less tech savvy. But at least there are a number of available migration choices, each with a number of product options. They offer varying degrees of convenience, efficiency, and precision, but eventually one must be chosen, as the https migration seems inevitable. How will this site migrate to https? That remains to be seen.

Online Ad to Block Online Ad

by @ 9:33 pm
Filed under: web

An online ad on YouTube for an application to block online ads. Oh the irony 🙂

[screenshot: a YouTube ad promoting an ad-blocking app]

The Great SYN Flood of China

by @ 9:55 pm
Filed under: hacking,web


I wake up yesterday morning and while still in bed I get the dreaded site-down alert from Pingdom on my smartphone. When a Web site goes down there are a number of simple preliminary steps one takes to pinpoint and fix the problem. Is the ISP having an outage? Are the modem and router up? Is the server up and is the Web service running?

The server was up but the Web service was unresponsive. The quick and dirty steps: restart the Web service, no dice; reboot the server, still no good. It was time to drag myself over to my desk and log into the Linux server to investigate further. Going through a bunch of diagnostic steps, this is what I saw:

[screenshot: netstat output showing connections piling up in the SYN_RECV state]

The Flood of The SYN-tury

Those familiar with the TCP handshake know that session setup consists of a SYN packet from host to server, followed by a SYN-ACK packet from server to host, and finally an ACK from host to server, and the connection is established. Reams of SYN_RECV entries on the server indicate a possible attack where a host, or a group of them, floods the server with the initial SYN but spoofs the source IP addresses or simply ignores the server's SYN-ACK, saddling the server with half-open connections, each one taking up a bit of the server's resources.
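
A quick way to spot this on the server is to tally the half-open connections by source address, something along these lines:

# count SYN_RECV connections per remote address, busiest offenders first
netstat -tn | awk '$6 == "SYN_RECV" {split($5, a, ":"); print a[1]}' | sort | uniq -c | sort -rn | head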

The server eventually cleans up the half-open connections but if the zombie connection numbers rise too fast, they exhaust the server and no more connections are accepted; the server goes offline. This is known as a SYN flood attack and it eventually leads to a condition known as DoS (Denial of Service) or the more dreaded form, DDoS (Distributed Denial of Service).

By now I was late for work, so I just blocked all traffic to the server and went to my day job. The server remained crippled all day (apologies to the users of my utilities) until I returned home and began the process of resolving the issue. I was hoping that by then the attackers had moved on to other targets, but no such luck.

The SYNs of China

I started opening up the permitted traffic little by little (by manipulating subnet rules in iptables), keeping an eye on the half-open connections. With every little opening I would see a flood of SYNs barge in, and I would block the IP addresses of some of the bigger offenders. This wasn't really helping and it was taking too long. There were just too many IP addresses.
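
Blocking an offender at that stage is a one-liner; the subnet below is just a documentation-range example:

iptables -I INPUT -s 203.0.113.0/24 -j DROP    # drop all traffic from one offending subnet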

Curious, I decided to look up some of these IP addresses on arin.net, and unbelievably all of them had been assigned to China: hundreds of subnets consisting of thousands of Chinese IPs working diligently to knock my site offline. Now, it is possible that the attack itself hadn't originated in China and the IPs were spoofed, but I would give that a very low probability.

Rescuing the SYN-king Server

It was time for drastic measures to save my Web site, and that meant blocking China completely. I hate blocking traffic, it goes against the very spirit of the Internet, but at this point I had no option. Thankfully I was able to find a site (cited below) that maintains a list of IP addresses assigned to China. It is a big and dynamic list, but I imported the whole thing into my firewall block rules, and with that hashemian.com was humming again. For good measure I also hardened the TCP/IP stack on my server a bit to better withstand SYN flood attacks (sources cited below).
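
The import amounted to something like the following, assuming the list has been boiled down to one CIDR block per line in a local file (the file name is hypothetical):

# collect the blocks in a dedicated chain so they can be flushed and reloaded easily
iptables -N CHINA 2>/dev/null
iptables -I INPUT -j CHINA
while read net; do
    iptables -A CHINA -s "$net" -j DROP
done < china_cidrs.txt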

As mentioned, it upsets me to have blocked so many addresses on my server, and in doing so to take up server resources. My site is insignificant, but if everyone blocked everyone else, imagine what a fragmented Internet that would be. At my day job I also see a lot of similar abuse coming from China. Other places such as Russia, Nigeria, and Estonia dish out their own abuses, but this sort of heavy-handed, relentless reconnaissance and attack is almost exclusive to China. Why was my site targeted? Was it some sort of a drive-by from a robot that got stuck and kept on hammering away?

I suppose I'll never know. But I know this won't be the end of it. Botnets and attackers are a lot more far-reaching than just China. They will be back with different attacks from different angles, but that's another day and another battle.

List of IP addresses assigned to China
http://www.okean.com/antispam/iptables/rc.firewall.china

Hardening TCP/IP
http://www.symantec.com/connect/articles/hardening-tcpip-stack-syn-attacks
https://www.ndchost.com/wiki/server-administration/hardening-tcpip-syn-flood

A little preview of TCP hardening on Linux:
# TCP SYN Flood Protection
net.ipv4.tcp_syncookies = 1
net.ipv4.tcp_max_syn_backlog = 2048
net.ipv4.tcp_synack_retries = 3
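
These settings belong in /etc/sysctl.conf (or a file under /etc/sysctl.d/) and can be applied to the running kernel without a reboot:

sysctl -p    # load the settings from /etc/sysctl.conf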

Yahoo Mail Down Again

by @ 10:09 am
Filed under: business,email,web

It is hard to say anything positive about Yahoo since Marissa Mayer took over the helm.

Useful services (e.g. Pipes) have either been eliminated or are just languishing (e.g. Mail). Today Yahoo Mail is down again. The stake in Alibaba hasn't turned out to be the savior it was once deemed.

At this point the stakeholders are surely kicking themselves for not taking Microsoft's buyout offer back in 2008.

The future isn't looking too bright for this once thriving vanguard of the Web. Perhaps it is time for new leadership.

Can Jet Beat Amazon?

by @ 4:18 pm
Filed under: business,web

The idea behind Jet, a new online marketplace site, is simple. Borrowing from Costco's concept, Jet charges its clients an annual fee and in return ships products to customers with no mark-ups and, in many cases, with substantial savings over other shops, including Amazon.

Being a Costco fan, I like Jet's model. Add to that a good dash of dislike for Amazon and I may actually try Jet at some point. Seems like Jet is having some success getting its name out.

Jet's founder, Marc Lore, was the man behind diapers.com, a once successful ecommerce company that got crushed under the weight of Amazon until what was left of it was finally assimilated by Amazon. Now Lore is back to take on Amazon again, only this time he's going after the entire retail side of the company. Amazon's CEO, Bezos, can't be too happy about this, but is he worried?

I doubt Bezos is losing much sleep, and here's why. Amazon may be known for perfecting the online marketplace and for being uber-competitive, but it has also become adept at thriving while swimming in failure. Which other company can lose money for over 20 years and be handsomely rewarded for it? For the last quarter it reported a measly $90 million in profit and saw its market cap rocket up by $50 billion when the market opened last Friday.

The point is that competing with Amazon is like racing a bottomless pit to the bottom. Amazon crushes its competition by spending a nearly infinite amount of money, knowing that its shareholders expect nothing but losses every year. When you are rewarded for losing money, it's not difficult to spend all the money in the world, and that's what Amazon does to stifle competition.

Of course Jet realizes this fact and it has dug in, preparing itself and its investors for years of losses as its competition with Amazon heats up. Will it find the same love and admiration from its backers and future shareholders? Doubtful, but I actually hope so. Not just because I dislike Amazon, but also because some day I'd like to start my own thriving business that is successful at losing money forever.

Craigslist eBay Motors Car Scam

by @ 1:09 pm
Filed under: internet,web

I'd been in the market for a used car when a too-good price on Craigslist caught my attention. I'd sold a street bike on Craigslist a few years ago and had a good experience, so I figured I'd go into this one with my antennas raised.

An email later, the seller reveals a sob story about the car belonging to her dead husband and wanting to move on. The car's in great shape with all paperwork in order. Sounds plausible, so can I see the car? The seller replies that the car is in some eBay garage across the country, in lot number so and so.

No worries, she just needs my info and eBay will contact me about payment. The money will remain with eBay until I receive the car and I have 10 days to inspect it. If any issues, I can return it at no cost to me.

So I ask for the eBay page where the car is listed. Seller says she took it down because of the fees. But really, eBay will make all arrangements.

Yeah, sure man. Of course at this point the full-blown scam was obvious, but it should have been obvious back at the Craigslist listing. A quick Google search revealed that this unholy Craigslist-eBay bait and switch is in fact quite common and that a number of people have been victimized, buyers and sellers alike.

So why this post? Just adding one more page to Google's search results to raise the warning volume slightly more.

Read this and stay vigilant. There's plenty more info on this. Just Google it.

Bait and Switch Google Adwords

by @ 12:37 pm
Filed under: google,hacking,web

We're all familiar with targeted banners these days. Visit a shoe site and suddenly all banners on various web sites are shoe-related.

It seems the banner scammers/hijackers have figured this out too. Recently I noticed suspicious Adwords banners originating from a site called adnxs.com.

My guess is that the malware authors use Adwords or similar networks or sub-networks to target users with certain keywords, for example shoes. They may upload legitimate ads in the beginning and may even run them for a while to gain the network's trust. But then the switch happens and malware ads such as the one below are displayed.

[screenshot: malware banner disguised as a software update]

To a lay user, a banner such as the one above may look legitimate enough to click, which will inevitably lead to a malware download, and it's game over for that user. The banner obviously has the telltale look of a scam, with the "importent" update it purports to install.

Hard to say if adnxs.com or similar sub-networks are in on the scam or just look the other way as long as the money keeps coming. Whatever the case, browsers and anti-virus programs seem unable to stop these annoying and harmful banners.

Facebook Like, The Big Fake

by @ 6:25 pm
Filed under: google,web

Earlier this year this insightful article delved into the business of click farming, where people and businesses (and apparently even the US government) pay shady companies a modest fee for thousands of Facebook likes, Twitter followers, or YouTube views. Only these likes and clicks are generated by click farms, either malware robots and zombies or zombie-like people clicking mindlessly, essentially producing inflated popularity through fraud.

I am not much of a social media expert, or even much of a user, yet I knew about click farming. I just didn't know how extensive the practice was until recently.

At this point we must assume that the vast majority of likes, views, and followers are fake. Certainly not everyone is involved, but faced with such an overwhelming and obvious scam, one must conclude that digital popularity is now but fiction and holds no credibility. And it doesn't matter who they are; even governments, legitimate companies, and celebrities cannot be ruled out.

Online scamming is not new. When link farming became a popular method to attain high ranking in Google results pages, Google fought back by changing the rules because SEO scamming was becoming an existential threat to its business. Once users' trust is lost, it is difficult, if not impossible, to gain it back.

Popular social sites such as Facebook, Twitter, LinkedIn, and YouTube now face the same credibility issue. They are fully aware of the problem and have the means to correct it, but it's business as usual because most users haven't woken up to the reality of click farming yet.

Just as everybody now immediately dismisses an email purportedly sent by a Nigerian prince, an increasing number of users are glossing over the stats on social sites. When the majority of these stats are fake, the whole system becomes useless and irrelevant.

Stack Overflow on stackoverflow.com

by @ 5:40 pm
Filed under: web

Ok, they're down for maintenance but for a second I thought their site had really blown the stack 🙂

[screenshot: stackoverflow.com down-for-maintenance page]
