Alex Ivaylov – a web developer – http://www.alex.bg


About Zend
Sun, 16 Dec 2018 – http://www.alex.bg/2018/12/about-zend/

PHP was created in 1994 by Rasmus Lerdorf, who still leads the PHP group today. It started life as a templating layer for his own software, written in C. He open-sourced it and people loved it. More and more functionality was added over the years and the language grew far beyond its original idea.

Between 1997 and 1999, two students named Zeev Suraski and Andi Gutmans rewrote the core of PHP; it was released as PHP 4 in 2000. The new core they wrote was named the Zend Engine – "Zend" comes from their first names – and it is still the main engine that runs inside PHP today. In 1999 they founded the company Zend Technologies. Between 2000 and 2010, Zend Technologies received several rounds of funding from various investors.

Over those years they have been the main contributor to PHP, and they call themselves "the PHP company". Their business model is offering paid PHP products and services to enterprise-class customers while helping to grow the open-source PHP community.

They also created a number of products:

Zend Framework – a modular PHP MVC framework targeted at the enterprise class. It brings the Java enterprise patterns to PHP and consists of professionally written packages that are (mostly) independent and managed via Composer.

Zend Studio – a commercial PHP IDE based on Eclipse, made specifically for Zend Framework development. They also contribute the PHP Development Tools (PDT) plugin to the Eclipse Foundation, which is available for free. In my opinion Zend Studio still offers the best debugging capabilities today, but not the best development capabilities (PhpStorm is the market-leading IDE today).

Zend Server – a custom PHP stack with a lot of extra goodies: Z-Ray, better debugging, better event reporting, better monitoring, automatic deployment, etc. This is a fantastic product. When I discovered it I wondered how I had ever lived without it.

Zend Guard – a PHP obfuscator. Unfortunately it has now been discontinued.

I have to say that I love their products. Zend Framework was the first framework I tried, back in 2011, and I was impressed by how much easier it was than vanilla PHP. This must have been version 2, which had the Zend CLI tool for creating modules, controllers and other boilerplate code. At that time Zend Framework was the market leader.

In 2015 a company named Rogue Wave Software acquired Zend Technologies. Shortly after that Andi Gutmans left the company and took a job at AWS.

This is a tough time for Zend: more and more PHP frameworks are popping up everywhere and stealing market share from Zend, while Node.js has appeared and is stealing market share from PHP itself.

The current market leader among PHP frameworks is Laravel. It stole the show because it is much easier to learn. It is built around SOLID principles, with a very good Inversion of Control (IoC) container that provides automatic constructor-based dependency injection, whereas the core component of ZF – the Zend Service Manager – uses the service locator (anti)pattern, as illustrated in the sketch below. Laravel also provides facades that offer almost all of the functionality a modern web app would need, and they are very easy to learn. However, Laravel is not enterprise-grade: it doesn't force modules on the user the way Zend does. Laravel is easier to get into and to learn, and it has better documentation.
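
To make the difference concrete, here is a minimal sketch (the Mailer and ReportSender class names are made up for illustration). With constructor injection the container works out what the class needs and builds it; with a service locator the class reaches into the container itself:

<?php
// Illustrative classes only – Mailer and ReportSender are made-up names.
class Mailer
{
    public function send($to, $subject, $body) { /* ... */ }
}

// Constructor injection: an IoC container such as Laravel's sees the Mailer
// type hint and builds/injects it automatically. The dependency is explicit and easy to mock.
class ReportSender
{
    private $mailer;

    public function __construct(Mailer $mailer)
    {
        $this->mailer = $mailer;
    }

    public function sendDailyReport()
    {
        $this->mailer->send('boss@example.com', 'Daily report', '...');
    }
}

// Service locator style: the class is handed the whole container (for example a PSR-11
// container or the Zend Service Manager) and pulls its dependency out itself,
// which hides what it really needs and makes it harder to test.
class LocatorReportSender
{
    private $mailer;

    public function __construct($container)
    {
        $this->mailer = $container->get('Mailer');
    }
}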

Zend Framework version 3 was released in 2016, but some aspects of it are still unfinished in 2018. It still doesn't have a CLI tool, among other things; if you look at a tutorial for ZF3 you will see you have to create something like ten folders manually just to make a module. At the same time, Zend Framework being an enterprise framework, they can't break existing code, so they have to stay backwards compatible – which means they can't change much. For that reason they decided to create a new framework to try to win some market share back from Laravel. They called it Zend Expressive.

This new framework takes all the good things from Laravel, but does them the Zend way: everything is decoupled into separate Composer packages, and you still have modules. Expressive doesn't force us to use their service manager or their routing component – we can use others. Expressive provides only the framework glue that holds those components together. It also provides middleware, which is something we see in Laravel – but in Expressive almost everything is middleware, while that's not the case with Laravel. A small sketch of what such a middleware looks like follows below.
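
In Expressive 3 middleware follows the PSR-15 interfaces. Here is a minimal sketch of one – the TimingMiddleware class and its header name are made up, and the wiring into the application pipeline is omitted:

<?php
use Psr\Http\Message\ResponseInterface;
use Psr\Http\Message\ServerRequestInterface;
use Psr\Http\Server\MiddlewareInterface;
use Psr\Http\Server\RequestHandlerInterface;

// Adds a timing header to every response that passes through the pipeline.
class TimingMiddleware implements MiddlewareInterface
{
    public function process(ServerRequestInterface $request, RequestHandlerInterface $handler): ResponseInterface
    {
        $start = microtime(true);
        $response = $handler->handle($request);                 // delegate to the rest of the pipeline
        $elapsedMs = round((microtime(true) - $start) * 1000);
        return $response->withHeader('X-Response-Time', $elapsedMs . 'ms');
    }
}
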
At the same time I get the feeling that Rogue Wave are reducing the resources these guys get. There are two people working on the Zend frameworks and another two working on the Zend Engine. The Zend product portfolio is becoming outdated: Zend Studio doesn't support version 3 of the framework or Expressive, Zend Server doesn't support them either, documentation is poor, and the CLI tool works for Expressive but not for Zend Framework 3 (a.k.a. Zend MVC). Meanwhile, Laravel and Node.js keep getting better.

The reason I am writing this is that in October, Zeev and the other three people who work full-time on those Zend products announced that Rogue Wave has decided to move resources away from all the Zend products apart from Zend Server. This is very bad news for PHP…

Although we did see PHP 7.3 recently, we are not likely to see PHP 8 any time soon if this is true. We have been expecting a JIT in PHP and JS-style asynchronous support in the Zend frameworks (via Swoole).

Rogue Wave were quick to assure us they will continue their support for PHP, but their actions speak otherwise. If they do this, PHP is left without a commercial backer and becomes an entirely community-run project – which means things will happen a lot slower (if they happen at all).

It's now December and I have been watching these guys – they have been committing as usual, and I really hope things won't change.

So what should Rogue Wave do?

I think they have the classic problem of business people and developer people being disconnected and not understanding each other.

They shouldn't look at the numbers in isolation – right now they see "Zend Server sells best, so let's kill everything else". What they forget is that Zend Server is tightly coupled to everything else: it links to Zend Studio, PHP (the Zend Engine) and Zend Framework. Without all of these there will be no Zend Server.

They should invest in updating and finishing those great Zend products.

After that, invest in making them more accessible. The Laravel ecosystem offers video lessons (Laracasts) for $9 per month; Zend offers live training that costs $1000 per session. Maybe look into a cheaper option, considering the market has changed? How difficult would it be to record those live training sessions and offer them at a lower price?

Back in the day, Zend tried to build an alternative to AWS and Google Cloud Platform. I am not sure why it failed – it probably wasn't finished. However, both AWS and GCP lack good PHP support, and Zend would have been perfect for this kind of service. With around 80% of the web running on PHP, why would anyone go to GCP, where they only support PHP 5?

I can’t believe they hold the keys to a technology that runs 80% of the web and they want to kill it because they don’t know how to profit from it.

I have worked with Zend consultants in the past. The first time I approached them they simply didn't understand the requirements; in the end they said they couldn't deliver and redirected me to another company. At the same time they sold me Zend Server – and when I went to their website and tried to buy it, I couldn't. I had to contact the Rogue Wave help desk, where they eventually took my money.

That's the other problem – make those products more accessible. I am sure "the PHP company" can put up a "buy now" button instead of forms that take your details so that some sales representative can contact you later (or not contact you at all, as in my case).

If anyone at Rogue Wave or Zend is reading this and wants to talk to me – feel free to click the “Contact me” button above.

Show PHP server2server connections in Fiddler (part 2)
Sat, 15 Dec 2018 – http://www.alex.bg/2018/12/show-php-server2server-connections-in-fiddler-part-2/

This is the second part of my original article "Show IIS PHP server2server connections in Fiddler". Here I will be talking about PHP's cURL extension.

However there are differences in this scenario:

We are not using IIS – we are using Zend Server on my local macOS machine (Zend's custom LAMP stack with extra goodies). A standard PHP installation should behave the same, though. If you are using Docker there will be differences in your network stack (the IP address below may be different), but you probably already know that.

We are not using Fiddler on Windows – we are using Charles on macOS. This is largely irrelevant, because Charles and Fiddler are very similar: I prefer Fiddler, but since I am on a Mac I am using Charles. Both proxies listen on port 8888 and show you the contents of the HTTP requests that go through them.

As mentioned in part 1, there is no universal way to tell PHP "use this proxy server for everything" – there is no global proxy setting in php.ini. For the cURL extension we can set these runtime options:

curl_setopt($ch, CURLOPT_PROXY, '127.0.0.1');   // proxy host – Charles/Fiddler running locally
curl_setopt($ch, CURLOPT_PROXYPORT, 8888);      // default Charles/Fiddler listening port

This will work fine with plain HTTP, but it will give you SSL handshake failures for HTTPS. This is expected, because HTTP debugging proxies use their own Certificate Authority (CA) to make the certificate they present look valid. On macOS, however, libcurl and php-curl don't use the system CAs, so we need to tell them which CA root certificate to trust.

To get the Charles CA root certificate, click Help > SSL Proxying > Save Charles Root Certificate. Save the .pem file somewhere on your system, then point the CURLOPT_CAINFO option at it:

curl_setopt($ch, CURLOPT_HTTPPROXYTUNNEL, 1);                         // tunnel HTTPS through the proxy (CONNECT)
curl_setopt($ch, CURLOPT_SSL_VERIFYSTATUS, 0);                        // skip OCSP status checks
curl_setopt($ch, CURLOPT_CAINFO, '/usr/local/zend/bin/charles.pem');  // trust the Charles root CA

This should show you the server2server connections in Fiddler/Charles. Use this on dev environments only.
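
Putting it all together, here is a minimal sketch of one request routed through Charles/Fiddler – the target URL is just an example, and the .pem path should match wherever you saved the certificate:

<?php
// Route a single cURL request through a local debugging proxy on 127.0.0.1:8888.
$ch = curl_init('https://example.com/api/status');      // placeholder URL

curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_PROXY, '127.0.0.1');
curl_setopt($ch, CURLOPT_PROXYPORT, 8888);
curl_setopt($ch, CURLOPT_HTTPPROXYTUNNEL, 1);
curl_setopt($ch, CURLOPT_CAINFO, '/usr/local/zend/bin/charles.pem');  // the exported Charles root CA

$response = curl_exec($ch);
if ($response === false) {
    echo 'cURL error: ' . curl_error($ch) . PHP_EOL;
}
curl_close($ch);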

PS: Obviously there is no way to intercept SSL connections that use certificate pinning.

Show IIS PHP server2server connections in Fiddler
Mon, 24 Apr 2017 – http://www.alex.bg/2017/04/show-iis-php-server2server-connections-in-fiddler/

So I am running PHP on the Microsoft IIS web server, and I have to open some sort of socket to a remote host. I would like to inspect that socket, because I want to see the data being exchanged between PHP and the remote web service. You could call this a server-to-server connection, but remember that as far as the remote web service is concerned, PHP is acting as a client. Fiddler is an irreplaceable tool for this – I have been using it for years and I really can't imagine my life without it.

Fiddler itself is a web debugging proxy. When it is started, it creates a proxy service (by default on port 8888) and configures Windows and the browsers to automatically route traffic through that proxy, but only for the current user. After that you see a list of all HTTP(S) requests that have gone through the proxy, and you can click on one to view all of its parameters. As a bonus it can render XML, JSON or HTML. This is priceless.

However, if you are on a Windows Server box with PHP running on IIS, it is highly likely that IIS is running under a different user, so server2server connections will not show up in Fiddler. Do not try to configure the proxy via the IIS console or web.config – those settings are specific to .NET apps and do not affect PHP. Remember, PHP is running in FastCGI mode.

What we need to do is tell PHP to use the proxy server. I was surprised to find that there is no proxy setting in php.ini. The recommended way to do this is using the stream_context_set_default() function:

$stream_default_opts = array(
    'http' => array(
        'proxy'           => "tcp://127.0.0.1:8888",
        'request_fulluri' => true,
    )
);

stream_context_set_default($stream_default_opts);
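
With the default context in place, plain stream-based calls go through the proxy automatically (the URL below is just an example):

$html = file_get_contents('http://example.com/');   // now routed via 127.0.0.1:8888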

As with most other PHP environment setters, this needs to be at the start of the PHP file. If you would like this to be a global server setting, you can save the snippet above in a .php file on your server and then have it automatically prepended (i.e. put at the start of) every other PHP file via php.ini, like this:

auto_prepend_file = "/path/to/proxySetter.php"

To make things more complicated, this method does not apply to all connections. If it doesn't work, you will need to research how to set a proxy for whatever client objects you are using. In my case, using the SoapClient class, I have to specify the proxy host and port explicitly:

$wsdl = 'https://example.com/path/myWebService.wsdl';

$options = array(
    'trace'      => 1,
    'exceptions' => 1,

    'proxy_host' => '127.0.0.1',
    'proxy_port' => '8888',
);

try {
    $client = new SoapClient($wsdl, $options);
} catch (Exception $e) {
    echo 'The web service call failed with error "';
    echo $e->getMessage();
    echo '"';
    exit();
}

If all of these fail (or you are too lazy) you could try a program called Proxifier but I can’t guarantee it will work.

Hope this will help someone out there 🙂

Copy Mac OS X installation from physical to virtual machine
Sun, 10 Apr 2016 – http://www.alex.bg/2016/04/copy-mac-os-x-installation-from-physical-to-virtual-machine/

I am writing this guide because I couldn't find a working solution to the problem of cloning a complete Mac OS X installation from a physical Mac to a Parallels virtual machine. To do that you need to take a full DMG image of the whole hard drive. Most solutions out there suggest that Parallels should automatically prompt you to convert the DMG file to its .HDD format when you add it to the VM, so try that first. However, that method did not work for me.

Before trying this method, you should also try taking a complete Time Machine backup and restoring it onto the VM. Unfortunately, that didn't work for me either; if the Time Machine backup/restore method doesn't work for you, follow this guide. You will need an external hard drive that is larger than the disk space you have used. This guide was written assuming you are taking the installation from an old Mac and putting it inside a VM on your new Mac, but it should also work if you want to reinstall OS X on the same box (I haven't tested it that way).

1) Plug the external hard drive into the old Mac.
2) Turn on the old Mac while pressing Cmd + R repeatedly during start-up, in order to enter the recovery options.
3) Choose Disk Utility.
4) Take an image of the whole hard drive and save it as a .DMG file on the external hard drive.
5) Once you have your DMG on the external hard drive, unplug the drive, restart the old Mac and plug the drive into your new Mac.
6) Copy the DMG file into a folder somewhere on your new Mac (for example "Documents").
7) Mount it (by double-clicking the copied DMG file). Let the system verify that it can open the image before it mounts it (don't skip verification – it might take some time depending on the size of your drive).
8) Optional: browse the files to make sure you are happy all your files are there.
9) Unmount it.
10) On your old Mac, open Parallels (if both your old Mac and new Mac have the exact same version of Mac OS X, you can use your new Mac instead).
11) Create a new VM, installing Mac OS from the recovery partition.
12) You should now have a virtual machine with a clean, working copy of Mac OS that is exactly the same OS X version as your old Mac.
13) If you used your old Mac, copy the VM to your new Mac (Documents/Parallels). On the new Mac choose "add existing VM" and add the clean Mac OS VM from the old Mac.
14) Open the settings of the VM.
15) Click the Add ("+") sign and choose hard drive.
16) Create a new blank hard drive that is bigger than the disk space used in the DMG file and make sure it is expandable (don't worry if your old hard drive is bigger than your new one – just make sure you have enough space to cover the used disk space).
17) Turn on the VM and go into the clean Mac OS installation.
18) Open the Disk Utility app.
19) Click File > Open Disk Image.
20) Go to the Parallels shared folders.
21) Find your copied DMG file and open it.
22) You should now have your old hard drive mounted inside your clean-installation VM.
23) Unfortunately, restoring to the empty disk from Disk Utility did not work for me here, so I looked for a third-party solution and used Carbon Copy Cloner. You can get a free trial from the official website, bombich.com.
24) Install Carbon Copy Cloner inside your VM and open it. Choose your mounted disk image as the source and your blank hard drive as the destination.
25) Click "Clone".
26) Once the clone completes, turn off your VM.
27) Go into the VM settings and delete the clean-installation hard drive without moving files to trash (by clicking the "-" sign). Make sure you leave only what was the blank HDD.
28) Boot up the machine – you should now have your old installation working.
29) Install Parallels Tools (from the Parallels Actions menu at the top).
30) Optional: if you are happy that everything is working, back up your VM (you can now delete the copy you made in step 13).
31) Optional: after you have backed up your VM, you can delete the clean-install OS hard drive to free up disk space (go to Documents/Parallels/, right-click on the VM and select "Show Package Contents").
32) Optional: delete the DMG file to free up space.

a new fancy dress website
Thu, 28 Jul 2011 – http://www.alex.bg/2011/07/a-new-fancy-dress-website/

The guys from Sco.Biz and I have just finished working on the Chicken Shop's website: http://chickenshop.co.uk

mass website hacking + LAMP vs .NET comparison
Sun, 03 Apr 2011 – http://www.alex.bg/2011/04/mass-website-hacking/

As I am writing these lines, history is happening. The biggest mass-hacking campaign in the history of the web is going on right now. The campaign is named LizaMoon, and more than 1.5 million websites have already been hacked – and counting.

The infection exploits a vulnerability in Microsoft .NET-based web applications. The two worst nightmares for me and every other web developer are SQL injection and XSS. While everyone talks about XSS, I was left with the impression that SQL injection vulnerabilities are not that common any more and that XSS was the bigger issue.
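
For context, the standard defence against this class of attack on the PHP side is a parameterised query. A minimal sketch with PDO – the connection details and the "users" table are made up for illustration:

<?php
// Connect (placeholder DSN and credentials).
$pdo = new PDO('mysql:host=localhost;dbname=example', 'dbuser', 'secret');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// User input is bound as a parameter, never concatenated into the SQL string,
// so a payload like "' OR '1'='1" is treated as plain data, not as SQL.
$stmt = $pdo->prepare('SELECT id, name FROM users WHERE email = :email');
$stmt->execute(array(':email' => $_GET['email']));
$rows = $stmt->fetchAll(PDO::FETCH_ASSOC);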

I am glad this happened at a time when everyone is thinking about moving from PHP to .NET. Let me show you a comparison between .NET and LAMP:

.NET: Closed source – only Microsoft knows what is inside the code.
LAMP: Open source – everyone can see what the code is.

.NET: You must pay to use it.
LAMP: You don't have to pay to use it – it's FREE!

.NET: Every 3 years a new version comes out and you need to pay to upgrade. Corporations have to spend millions on licensing in order to upgrade, which they do not do. As a result the infrastructure gets outdated – millions of organisations are using Microsoft software more than 10 years old which is out of support.
LAMP: New versions come out often and it's free to upgrade. Corporations are free to upgrade and don't need to pay anything.

.NET: You must pay Microsoft to get support.
LAMP: Millions of people from the open-source community are ready to help you in forums, newsgroups and IRC – and they don't want a penny from you.

.NET: Developed by people paid to work 9-5, under pressure to chase deadlines, and in the rush they miss many things.
LAMP: Developed by people who do it because they love it. Then the code is released and other people help them improve it. All done because they love what they are doing, and available for free.

.NET: Compiled code – the only advantage I can see. The code is in machine language so it executes faster.
LAMP: The code is in "human" format and has to be interpreted every time the website is requested, which slows it down a little.

From the above table, my conclusion is that if you choose to develop in .NET, you choose to pay Microsoft all the time, because that is Microsoft's business model. They don't care about you – they want your money. And now the biggest attack is happening because of a problem in their software, and they are sitting quiet about it.

We know that LizaMoon is a SQL injection attack, but we don't know exactly what the vulnerability is. With such a large number of websites infected, I would guess it's something in .NET at a lower level, but this is just a guess.

They already said it’s not their fault:

Microsoft is aware of reports of an ongoing SQL injection attack. Our investigation has determined these sites were exploited using a vulnerability in certain third-party content management systems. This is not a Microsoft vulnerability.

Now that the bad guys have a database of all the vulnerable websites, it's only a matter of time until they launch their next attack. God knows what they are going to do next.

Good luck to all of you guys fixing this – I know some of you have had sleepless nights. I know God will help you! Don't lose faith!

Latest information about LizaMoon

LAMP: installing imagick on Ubuntu and CentOS + some thoughts
Fri, 11 Mar 2011 – http://www.alex.bg/2011/03/lamp-install-imagick-on-ubuntu-and-centos-plus-some-thoughts/

So I need the Imagick PHP extension for an old web system I've got to work on these days. This is made so easy on Ubuntu:

root@webdev:~# apt-get install php5-imagick

Then just restart the web server:


root@webdev:~# /etc/init.d/apache2 restart

And that's all! One line in Ubuntu and we have the extension working. Life's good!

Well, it's not, because we decided that CentOS would be a better choice for a server operating system. The reason is that it is the free version of Red Hat and it's supposed to be "enterprise class" – and as you know, in the IT industry business-class products are supposed to be better.

So I have to install imagick on a CentOS web server. Of course, the Ubuntu (Debian) packages do not work on CentOS, and there's no apt-get (it uses yum instead, which is not as sophisticated in my view).

So let's see how we do it here. First of all, we need the main ImageMagick package. That's easy enough:

[root@webdev ~]# yum install ImageMagick.i386

And then the dev package:

[root@webdev ~]# yum install ImageMagick-devel.i386

Then I need to install the imagick PHP extension via PECL. This will require some compilation from source, so we need to make sure we have a compiler. If you don't have gcc, install it:

[root@webdev ~]# yum install gcc

Then we need to install via PECL:

[root@webdev ~]# pecl install imagick

Then we need to add “extension=imagick.so” to php’s configuration:

[root@webdev ~]# echo "extension=imagick.so" > /etc/php.d/imagick.ini

/etc/php.d is a directory that PHP scans on start-up, loading every ".ini" file in there. What this exercise does is create a new file called "imagick.ini" in that directory with "extension=imagick.so" as its content.

And then we need to restart the web server – notice that the daemon is called httpd here, not apache2:

[root@webdev ~]# /etc/init.d/httpd restart

After that, we can check the loaded PHP extensions to confirm imagick is in the list:

[root@webdev ~]# php -m
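
As a quick sanity check that the extension actually works, here is a minimal PHP snippet – the input.jpg and thumb.jpg file names are just placeholders:

<?php
// Print the linked ImageMagick version and create a small thumbnail.
$image = new Imagick('input.jpg');
$v = $image->getVersion();
echo 'ImageMagick: ' . $v['versionString'] . "\n";

$image->thumbnailImage(200, 0);      // 200px wide, height scaled to keep the aspect ratio
$image->writeImage('thumb.jpg');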

So why does it have to be different? Why does the daemon have to be called apache2 on Debian and httpd on Red Hat? Can it not be httpd everywhere, the same way we have sshd/mysqld/ircd/etc? And why do I have to compile? It's freaking 2011!

You see, because of problems like that we will always have big players such as Microsoft. It's just a lot of pointless hassle – hassle for us, and hassle for the developers who, instead of focusing on improving the product, have to waste days packaging for different distributions.

And of course, now I will get all those fans of different distros arguing that compiling is better and that we are lame because we prefer the Ubuntu approach. But no matter what you say, you cannot convince me that I am wrong and that compiling from source is better than apt-get.

I personally feel that CentOS so far has been nothing but hassle, and I prefer Ubuntu Server.

about WYSIWYG editors / green web development
Sun, 21 Nov 2010 – http://www.alex.bg/2010/11/about-wysiwyg-editors-green-web-development/

I hate WYSIWYG web editors and I never use them (read one of my stories about Dreamweaver). The idea behind this type of editor is great, but it will only work properly maybe in 5 or 10 years, when HTML5 is the current web standard and all the web browsers render to similar standards (we are getting there, just hold on tight). I've read the book "Weaving the Web" by the creator of the web, Sir Tim Berners-Lee, where he states that he wants people to be able to publish material on the web without needing HTML or any other technical knowledge. Well, I am afraid that's not possible. Not yet.

A very funny example: one guy comes into the office and tells me, "I bought web hosting with a domain name from Fasthosts (everybody in the UK buys from there for some reason), and then I uploaded a document, but when I open the website I can't see it."… The first thing I asked was "What format is the document?" He said "It's an Office Word document", and I answered "I am afraid that's not a web format – you will need to convert it to HTML." His answer was "Convert it to what?" So I showed him how to use MS Word to save a document as HTML (never, ever do this, by the way). The guy took notes on what to click where and went home excited that he had learned something new and would manage to get on the web. But he came back the next day and said, "I saved it the way you told me and uploaded it, but I still can't see it…" The next thing I asked was "What did you name the file?" He said "A di ken" (that's Scottish for "I don't know")… "I think it's something like welcome.html." Since I am a nice guy and wanted to help him, my answer was "You will need to rename it to 'index.html'." The guy wrote that down and I've never seen him since, so I assume he managed to get it so that people can see it.

This example shows that you need to be at least a little technical in order to publish something on the web for everyone to see. Of course, there are solutions such as sitecreator and social networks like Facebook, Twitter, Blogger, etc., but they are not what the user wants – they limit him to what they can offer, and no matter what service is created, you will always have limits if you are not a techie-nerd like me 🙂

WYSIWYG editors are great if you simply have one empty white page without any design, but this is the 21st century and such pages are history. Here comes another problem: people who have paid money for a website want to be able to change it themselves. There are a few solutions to this. The best option is to get your customers to learn (X)HTML/CSS, but more than 99.9% of them are not technical enough to understand it, and they have their own business to run, so they can't waste time on that – which makes this solution unacceptable. The next good solution is to use some sort of web Content Management System (CMS), but CMSes usually have a lot of requirements and are complex and expensive. While static HTML pages are just a few files sitting on a web server somewhere, a CMS is a web application where a lot of web technologies such as PHP/MySQL are combined to create a website. For that reason their development and customisation is very expensive, and not everyone can afford it.

Another disadvantage of CMSes is that, because they are programs, they have to be run by the web server every time someone visits your website. Most of these CMSes are complex, they talk to some sort of database management system (such as MySQL), and they make the web server use a lot of resources – and by using those resources, the server uses more energy as well. While the browser is communicating with the server, the connection is relayed through a few other servers which are part of the so-called network route. To get through this route, the data needs to travel through miles of cables and a lot of switches/routers/filters. All of these use electricity. And yes, I am one of those web developers who think green… did you know that every time you do a Google search, enough energy is used to boil a kettle?

Most of the web users I've seen don't know how to use their browser's address bar. They usually have some sort of search engine like Ask or MSN as their home page. So in MSN they type "Google", then they go to Google and type the website they want to visit – usually something like "bbc", "facebook" or their email, "hotmail". So instead of just typing bbc.com into their address bar, they have used enough energy to boil three kettles. And they are not just one or two – there are millions of them! The problem is that people don't know this… someone needs to go and tell them. I think there should be a TV campaign explaining it.

You are probably wondering what WYSIWYG editors have to do with green IT. Well, I'll tell you now: what you see on your screen at the moment is the result of what your browser (the program you use to see this website) has generated (rendered). Your browser renders the source code of this page, which is the set of instructions on what to display and do (right-click with your mouse and select View Page Source to see the kind of stuff I work with). This means that your browser does the work, and for that reason it makes your PC use more energy. In general, browsers render well-formatted, valid code far better than messed-up code, so your browser will work faster and with less energy if the website is well formatted. Unfortunately, WYSIWYG editors always generate messed-up code, which is more work for the browser. Now imagine millions of computers each using more energy.

I've heard a lot of people who call themselves web professionals, who don't know anything about simple HTML, use programs like Dreamweaver and state how good they are. If you are looking for someone to do a web job for you, first ask them what they think about Dreamweaver. If they say the same as what I've said on this website, hire them; but if they start admiring it and try to convince you how good it is… run away!

What software I use
Sun, 21 Nov 2010 – http://www.alex.bg/2010/11/what-software-i-use/

As you know, I am a LAMP web developer. LAMP stands for Linux, Apache, MySQL and PHP – all free, open-source software products working together to bring you this website and most of the rest of the web.
Linux is my operating system. On my servers I use CentOS, and on my personal computers/laptops I use Kubuntu (which is Ubuntu with KDE).

Speaking of LAMP, why don't you like my Facebook page about LAMP? I post a lot of interesting stuff there. Thanks!


Everywhere I use the Apache web server. When I am working on something, it's usually PHP scripts under my web-public directory (/var/www/vhosts/…). For my PHP coding I use gPHPEdit, which is a great Linux application, and for graphics I use Photoshop CS5 (run via Wine). I am trying to get used to GIMP but it just feels so different.
Sometimes I feel too lazy to download a file and edit it locally, so I spend hours working on PHP code over SSH using a text-based editor such as vi or nano.

When I am doing more front-end web work, such as XHTML/CSS, I use the great editor Bluefish, which can also run on Windows. You could try it.

My browser is (of course) Firefox, which is the best browser on the market. I have a few add-ons installed:
1. Firebug (number one web development addon)
2. Web Developer Toolbar (a great extension that has tons of features)
3. Colorzilla (color picker from Firefox, very helpful)
4. Firefox Sync (all my bookmarks, my browsing history and all my passwords and other form data are being synchronized across all my PCs and laptops automatically – priceless!)
5. Extended Statusbar (the only thing I missed from Opera was the status bar that showed all the details of what's going on – this is the solution for Firefox)
6. LiveHTTPheaders (I sometimes need to watch the things on a lower level to identify problems – this is the tool for that)
7. World IP (with so many web servers/hostings/VPSes, this tool helps identify which one the current site is on, by showing the IP and the organisation that the IP range is registered to – I used to do nslookups all the time before this extension)

Some of you might be thinking, "This guy is a noob, he only has 7 extensions… I've got like 50." There is a reason I use only these 7: the fewer extensions you have in Firefox, the better, faster and more reliably it will work – especially if you have about 20 tabs open all the time like me.

I also have virtual machines with Windows, but they are only for testing my web products under Internet ExploDer.

Well… that's all for now 🙂

P.S. Have you liked my Facebook page yet? 🙂

These days
Wed, 06 Oct 2010 – http://www.alex.bg/2010/10/these-days-3/

I am just back from Bulgaria and I am starting work on a few projects. The most interesting one is a content management system (CMS) that I am building from scratch for a client. You don't realise how much work is involved and how much functionality there is behind a CMS (such as Joomla, WordPress, Drupal, etc.) until you start building one yourself. At the moment I am working on a system for managing self-describing data (I am not talking about XML) – that is, managing the data about the data, which is provided by the user – and doing advanced operations and calculations with it.
