Hackintosh home project

After fiddling about with Linux, Windows XP, Windows 7, Linux again (different flavours) and totally hating the new Gnome3 desktop, I’m now trying something new for my home project.

Building my own Apple Mac machine. :) Only cheaper!!!

Around this time my home PC was about to die, so I had to buy new hardware anyway. After 7 years of loyal service, I decided to build a new home PC with the Apple Mac specifications in mind. After all, the new Macs are Intel based, so it should be easy to get this to work.
I’ve done some investigation and decided to build one with the following specs in mind:
At least a 4th-generation Intel i5, plenty of RAM (16GB), a 250GB SSD, a small and quiet case, and a WiFi adapter.

According to this nice site: http://www.tonymacx86.com/ I could spec my own CustoMac, and it had to be built around a Gigabyte motherboard. The big issue in getting the thing working is basically WiFi, and yes, WiFi is all I have available at the place where this machine sits; I’m just too lazy to pull cables through the house. Also, since I’m going for Apple, the Bluetooth adapter should work as well because of the Apple keyboard and the Magic Mouse. (The last one seems to be the most difficult part due to the Bluetooth detection inside OSX.)

I’m new to OSX so I hope I can manage the thing. Fortunately I’m very experienced with all sorts of Unix, and when I started at CSC (TRW at the time), I used to work with Mach/OSF i386 machines (in 1997). In fact, OSX and iOS still carry the TRW branding in the header files of the SCSI driver code.

Specs:
Gigabyte Motherboard: GA-H87N-WIFI (not using the WIFI because it’s an Intel 7260 mini-PCI-half card = no OSX driver)
Intel Core i5 4570 boxed
Kingston HyperX blu KHX16C10B1K2/16X
TP-Link TL-WDN4800
Samsung 840 EVO 250GB
be quiet! System Power 7 300W
BitFenix Phenom Mini-ITX Black
Targus Bluetooth 4.0 Micro USB Adapter
Apple keyboard (wired)
Apple magic mouse

USB3.0 stick to place OSX install files on (no CDROM/DVD on board)
Currently I’ve attached two 24″ Iiyama screens (1080p) to the HDMI ports.

Building:
Basically, building the machine is pretty straightforward: just put it together and make sure the small cables are connected to the right spots on the motherboard.

Preparing the software:
Unfortunately I could not get the Apple installation media, because I don’t have a Mac and therefore no access to the Mac App Store. Fortunately I could get my hands on a VMware virtual machine with OSX and used that on my Windows PC to prepare the USB stick. I used Mavericks 10.9.3.

I followed the guides here:
http://www.tonymacx86.com/374-unibeast-install-os-x-mavericks-any-supported-intel-based-pc.html
and
http://www.tonymacx86.com/golden-builds/109953-customac-mini-2013-ga-h87n-wifi-core-i3-4340-intel-hd-4600-a.html

Configure the BIOS according to the manual.

Inside the OSX virtual machine:
Basically what you do is download OSX from the App Store, download the UniBeast install tool, and use it to put everything on the USB stick.

Boot the new PC from the USB stick and the installer starts.

When the installer had finished, I had to use the -x boot flag; otherwise the post-install part didn’t start and complete.

When done, I had to download the MultiBeast kext and bootloader installer tool.
Select your drivers, that is:
DSDT Free, sound card: ALC892,
and leave the rest pretty much at the defaults.

Reboot and you are done.

The configuration was quite easy but it took a while to get used to all the screens inside OSX.
I’ve disabled the power-saving/sleep stuff to prevent issues where the machine doesn’t wake up.

I’ve used an older version of FileNVRAM to get iMessage to work.

Some things that are not working yet:
I cannot shut the machine down for some reason, and it does not wake up when using the keyboard/mouse. I’ve dug around and it looks like there is some interference with a USB or PCI card, probably the Intel WiFi card. I can shut down OSX, but it does not power down the PC, similar to some Linux bugs related to ACPI.
The same goes for sleep mode. I’ve disabled the power-saving stuff and display power save mode.
Fortunately the whole machine starts up within 5 seconds, so no big deal here.
Currently I just use the shutdown option inside OSX, which brings down OSX cleanly, and after that the power button on the case to power off immediately (configured inside the BIOS to do that, to avoid the 5-second waiting time).

I’m still waiting for the USB Bluetooth dongle, so my Magic Mouse is not connected yet.
The Intel card on the motherboard has Bluetooth and is even detected inside OSX, but
I’m unable to get anything connected. I’ve decided to use a USB dongle with a supported
Broadcom chipset, the BCM20702A0. I found a local reseller who had a Targus-branded dongle
with this chipset. Hopefully that works.

The whole setup was about 650 euros, which is not really expensive for this kind of power.
The cheapest Apple starts around that price, but then you get a Mac mini with a slow laptop-grade i5 from two years ago and only a few GB of RAM. Now I have a fast 4-core 3.5GHz CPU and 16GB of RAM.
I was initially waiting for the new Mac mini, but there is still no word on when Apple is going to bring it out, and I didn’t want to buy such an expensive machine with these poor specs.
Eventually maybe a MacBook Pro, but I first want to see if OSX is something for me.

I will keep you up to date.


Replace text in files with Linux

# replace from_text with to_text in every .php file under the current directory (edits the files in place)
find . -name '*.php' -exec sed -i "s/from_text/to_text/" {} \;


MySQL database dump directly via SSH

I had a server with almost no disk space left and I needed to move the site to another virtual server. Making a database dump locally was not an option, so I Googled around and found some handy command-line ways to copy the database to the other server.

Dump MySQL database gzipped via SSH to destination server:
mysqldump -uMYSQL_USERNAME -pMYSQL_PASSWORD DATABASE_NAME | gzip -c | ssh USERNAME@DESTINATION_HOST 'cat > ~/dump.sql.gz'

Dump MySQL database gzipped via SSH and import it on the destination server:
mysqldump --opt --add-drop-table -Q -u localuser -p localdatabase | gzip -c | ssh remoteuser@remoteserver 'gunzip -c | mysql -u remoteuser --password=remotepassword remotedatabase'


iPhone/iPad website body background image scales wrong when using jpg

I had a website I was working on, http://www.indiansummerfestival.nl, and I used a body background image for the top header.
In every browser and on every Android device the body background image was scaled proportionally with the zoom level of the website on the device/screen.

But on iPhone and iPad the body background image was scaled to 100% width… On an iPhone the site was properly zoomed out to fit the width of the screen, but the body background image zoomed out even more, which messed up the design.

After spending some time figuring out what could cause this, I finally found the solution:
when you save your JPEG as a progressive JPEG, it works as expected.
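
If you generate or post-process the image with PHP anyway, GD can save it as progressive for you. A minimal sketch (the file names here are just examples, not from the actual site):

<?php
// Re-save an existing baseline JPEG as a progressive JPEG using GD.
$img = imagecreatefromjpeg('header-background.jpg');       // example input file
imageinterlace($img, true);                                 // interlaced JPEG = progressive
imagejpeg($img, 'header-background-progressive.jpg', 85);   // 85 = JPEG quality
imagedestroy($img);

Of course you can also just re-save the JPEG as progressive in Photoshop or any other image tool.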


PHP SoapClient exception Error fetching http headers

I had a problem with a SoapClient and a strange error: “Error fetching http headers”.
It seems the problem occurs when multiple SOAP calls are done with a single SoapClient instance.
I think the first SOAP connection is somehow still “active”, resulting in an error on the second SOAP call.

I started Googling around and found this post, which explains the issue in detail.
Creating a new SoapClient for each call was the solution for me.
I hope this post can help someone with the same problem.
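
For reference, this is roughly what I mean; a minimal sketch, not the exact code from my project (the WSDL URL and method name below are made up):

<?php
// Create a fresh SoapClient for every call instead of reusing one instance.
function callSoap($wsdl, $method, array $args = array())
{
    try {
        $client = new SoapClient($wsdl, array('exceptions' => true));
        return $client->__soapCall($method, $args);
    } catch (SoapFault $e) {
        // e.g. "Error fetching http headers": retry once with a brand new client
        $client = new SoapClient($wsdl, array('exceptions' => true));
        return $client->__soapCall($method, $args);
    }
}

// Example usage (hypothetical service):
// $result = callSoap('https://example.com/service?wsdl', 'getStatus', array(array('id' => 42)));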

Below the original post for archive purposes:
http://php.blogaboutwhatever.com/2011/11/error-fetching-http-headers/

… really is as uninformative an error message as you can encounter during a call to some SOAP service. At least at first glance. Digging deeper into the issue, there are some really interesting causes for that one.

Ok, first some more information. We do a SOAP call and sometimes end up with a SOAP fault ‘error fetching http headers’. Took some time to find the real reason for that.

A simple explanation of why this can happen is if you have a default_socket_timeout setting of x seconds in your application and the remote SOAP server takes more than x seconds to answer.
So your system gets no response in time and starts complaining that it got no headers.
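
A quick illustration of where that setting lives in PHP (the URL and value below are just placeholders):

<?php
// Give a slow SOAP service more time before the socket read gives up.
ini_set('default_socket_timeout', 120);   // seconds; PHP's default is 60
$client = new SoapClient('https://example.com/service?wsdl');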

Reason one: slow remote SOAP service.

Some additional useful information: the apache access log by default records the time the request started being processed by apache, but the entry ends up in the log at the time the response is sent out. Requests that never reach apache do not end up in the access log at all.

Well, we checked all available logs and could rule out a slow SOAP service for certain. There was no way it could run into a timeout. We could see logged error messages with ‘error fetching http headers’ on the client, but those requests never ended up being processed by the SOAP server’s apache. After a bit of extended logging it looked like the client never really sent the requests at all. So we started searching for a reason why the client thought it had sent the request but in the same second already logged the empty response error.
At first we suspected some PHP SOAP client or openssl or whatever bug that tricked the client into believing it sent a request when it did not.

Alas, my colleague came up with an interesting link:

http://stackoverflow.com/questions/4824799/soapfault-exception-http-error-fetching-http-headers

which finally gave us the hint about where to look.

The client PHP code does the following: get an instance of a class, initialize a SOAP client pointing to our SOAP server in the class constructor, and make two SOAP calls. Then do something completely different, namely call a second remote system. After that call is done, the formerly initialized SOAP client is reused for another SOAP call, and that is what went wrong sometimes.

We had a look at the headers:

Client sent this during request: Connection: keep-alive

SOAP server sent this in response: Keep-Alive: timeout=5, max=100

So here we have our reason: the server keeps the connection persistent for 5 seconds which is apache default. The client thinks the connection is persistent and it can call the remote server without sending a handshake first on all subsequent SOAP calls. That is ok if all calls can be done during 5 seconds. But if the call to the second remote system that takes place between SOAP call two and three takes too long, the third SOAP call ends up with ‘error fetching http headers’. Keep in mind, there were no unusual server configurations on any side!

Reason two: default setting of PHP and apache clashing with reusing a SOAP client in PHP.

So here we go, first fix: get a new PHP SOAP client for each call; the WSDL is cached by default, so that does not slow the process down. At least it did not look like it did when I tested.

Second fix, though this one is only to make our customer’s life a bit easier: raise the Keep-Alive timeout on the SOAP server. That’s not so easy to do, because we might then end up with more or longer-running apache processes. So in times of heavy traffic we might run into trouble with too high a load on our SOAP server.

So note, all you PHP developers out there: take care when reusing SOAP client instances, you might get hard-to-find errors. Playing it safe: never reuse the SOAP client; if you do reuse it, you can try/catch the SoapFault and retry with a new SoapClient instance.

The PHP SOAP documentation is not really helpful; maybe there’s some way to force SoapClient into taking better care of its connections, but I have not found it yet.


Stop OSX Lion from opening all applications after reboot

One VERY annoying feature of OSX Lion is that it reopens ALL the applications you had open, despite the fact that you disabled the “Restore windows when quitting and re-opening apps” feature in System Preferences.
After some Googling around I found a solution that actually works!
Here it is:

- Open a terminal
- Run the following commands:

$ chflags nohidden ~/Library/
$ cd ~/Library
$ chmod a-w Saved\ Application\ State/

- Reboot

I hope this helps anybody with the same annoying problem.


Increase fastCgi / PHP activityTimeout in IIS7

When you have installed PHP with the Microsoft Web Platform Installer, PHP is installed in C:\Program Files (x86)\PHP\v5.2.
When your PHP script runs longer than 10 minutes (600 seconds) you will get a 500 Internal Server Error.
To fix this issue you will need to increase the fastCgi activityTimeout in IIS7.
After Googling around I found some solutions, but because the Web Platform Installer puts PHP in a path with spaces in it, all the examples failed.
I Googled some more and after a while combined two solutions and got it to work.

Start a command prompt and go to this folder:
c:\Windows\System32\inetsrv

Execute this command to see the current settings:

C:\Windows\System32\inetsrv>appcmd list config -section:system.webServer/fastCgi

Output will be something like this:

<system.webServer>
  <fastCgi>
    <application fullPath="C:\Program Files (x86)\PHP\v5.2\php-cgi.exe" monitorChangesTo="php.ini" activityTimeout="600" requestTimeout="600" instanceMaxRequests="10000">
      <environmentVariables>
        <environmentVariable name="PHP_FCGI_MAX_REQUESTS" value="10000" />
        <environmentVariable name="PHPRC" value="C:\Program Files (x86)\PHP\v5.2" />
      </environmentVariables>
    </application>
  </fastCgi>
</system.webServer>

To change the timeout to 1 hour (3600 seconds) execute the following command:

C:\Windows\System32\inetsrv>appcmd set config -section:system.webServer/fastCgi "-[fullPath='C:\Program Files (x86)\PHP\v5.2\php-cgi.exe'].activityTimeout:3600"

Note the use of double and single quotes to specify the parameters correctly.
After you run this command you will see this output:

Applied configuration changes to section "system.webServer/fastCgi" for "MACHINE/WEBROOT/APPHOST" at configuration commit path "MACHINE/WEBROOT/APPHOST"

After running the list config command again the output now looks like this:

<system.webServer>
  <fastCgi>
    <application fullPath="C:\Program Files (x86)\PHP\v5.2\php-cgi.exe" monitorChangesTo="php.ini" activityTimeout="3600" requestTimeout="600" instanceMaxRequests="10000">
      <environmentVariables>
        <environmentVariable name="PHP_FCGI_MAX_REQUESTS" value="10000" />
        <environmentVariable name="PHPRC" value="C:\Program Files (x86)\PHP\v5.2" />
      </environmentVariables>
    </application>
  </fastCgi>
</system.webServer>

As you can see, the activityTimeout is now increased.
Restart your IIS service and you're good to go.
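
One extra note: PHP itself also has a max_execution_time limit that can stop a long-running script. If you don't want to raise it globally in php.ini, you can do it per script, roughly like this:

<?php
// Allow this particular script to run for up to an hour (3600 seconds).
set_time_limit(3600);

// ... long running work here ...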


Connect to Windows Samba share on Mac OSX with SSH

After some Googling around I found this post. It's a very clear how-to on connecting to a Windows Samba share from Mac OSX over SSH, for example to reach your NAS at home from any other location.
I copied it to my own site to be sure it stays on the web.

All credits go to the author.

Chances are if you are seeing this, you’ve tried quite a bit but it hasn’t worked. Look no further. If you are seeing this and haven’t been researching it, then this should still be enough info to get a good start.

First, here’s the point: using Windows file sharing (Samba/SMB) is a good way to access your files across your home network, but don’t even think about trying it over the internet. In order to access SMB shares across the internet you’re going to need to get creative. A method which works reasonably well is using a zero-configuration VPN program such as Hamachi, Remobo, Wippen, etc. to create a virtual LAN connection, thus fooling your computer into connecting as if you were on the same LAN. That works, but in my experience it isn’t very reliable, it has limitations, it has overhead, and it means you have to have that ZCVPN client on both ends.

So here’s my solution: skip the program and jump straight to the solution. If you use an SSH tunnel to connect to your computer, you can access your SMB shares, you can use VNC to view your screen, or do just about anything that uses a port on your host computer. The best part about it is, once you have it up and running, it’s really simple to use!

Note: This post will assume that your “server” machine is running windows and your “client” machine is running Mac OS X Leopard.

Here’s how to do it:

1. Enable file sharing on your host computer (I’m going to assume this is running Windows). This will allow your files to be shared across your local network. If you don’t know how to do that, there’s a very good guide Here.

2. (Optional) Disable simple file sharing and edit the permissions on your shares so that the shares are password protected. You only need to do this if you don’t want just anyone on your local network to be able to access your files. (Google it)

3. Install an SSH server on your host computer; I'd recommend freeSSHd. This will allow you to create a secure connection between your computers. I'd suggest freeSSHd because it's free and much easier to use than many of the alternatives (OpenSSH/Cygwin).

4. On the SSH tab in the freeSSHd settings, change the port to whatever port you want; I'll be using 12345 in my examples. I'd recommend something between 10000 and 50000 so that a network scanner is less likely to pick up the port.

5. On the Users tab in freeSSHd, add a user with the username and password of your choice, and set the password option to “Password stored as SHA1 hash”.

6. On the Tunneling tab in freeSSHd, enable local and remote port forwarding.

7. Test your SSH server to make sure you can connect to it from a computer on the same network as the SSH server. You will need the local IP of the SSH server for this step; you can find it using This guide.

To test it from your mac machine:

Open the Terminal (Applications/Utilities/Terminal)
Use the command ssh -p port username@hostip (Example: ssh -p 12345 lococobra@192.168.0.2)

8. Enable port-forwarding on your router to your SSH server at the port you used – Follow one of the guides for your router Here but use the port for SSH (12345)

9. (Optional) Set up a dynamic DNS name for your host computer; you can set that up Here for free. I'd really suggest you do this, it's very useful! Once you have that set up, install the No-IP Dynamic Update Client so that your DNS name always matches your dynamic IP.

10. Test your SSH connection via the port forward. This is almost exactly the same as before, except instead of using the IP you got from ipconfig, use your global IP (or the DNS you set up in step 9). You can find your global IP Here.
Example: ssh -p 12345 lococobra@myDNS.hopto.org

Now that we have all that set up, we're almost done. What we're going to do is connect via SSH and forward the SMB ports from our host computer to our client. This will allow you to access your shares remotely. It works because your SSH/SMB server will think that it's directly connected to your client computer, when in fact the connection is all handled through SSH. The tricky part is that OS X Leopard will not allow you to do this. If you forward the SMB ports from the server to the client computer, the client will think that it's connecting to itself, and so Leopard will deny the connection. In order to defeat this we're going to have to work some magic.

11. Set up an alias for your loopback connection (localhost/127.0.0.1) on your Mac. This will fool your computer into thinking it’s connecting to an external IP. This command needs admin privileges, so you have to use sudo. The command is:

sudo ifconfig lo0 127.0.0.2 alias up

This will create a temporary alias for your loopback connection which will stay active until the computer is restarted.

12. Edit the all users configuration file for your SSH settings so that you can connect quickly without setting it up each time.

Open the Terminal and run sudo pico /etc/ssh_config
Enter the following text above the line that says ” #Host *”, change the user and port to the ones you have used in your SSH configuration.

Host AliasForHost
HostName hostip
Port 12345
User YourUserName
ServerAliveInterval 200
ServerAliveCountMax 3
LocalForward 127.0.0.2:139 127.0.0.1:139
LocalForward 127.0.0.2:445 127.0.0.1:445

Keep in mind that you can add any number of ports to this list. For example, if you want to connect to VNC, add 5900 to that list. Then to use VNC, connect to 127.0.0.2:5900.

Hit control + X, then Y, and Enter to save the file. If you need to edit it again, you can do it through pico the same way.

13. Initiate the SSH connection with your host computer using the host alias we set up before.

sudo ssh AliasForHost

14. Connect to the Samba share. Open a Finder window and hit command + K to open the Connect to Server window. For the server address, use:

smb://127.0.0.2

Now click Connect, and if everything went well you should be prompted with a window to enter your login credentials for the server machine!

Wow, that was complicated, but at this point it doesn’t need to be. Here’s a little AppleScript I came up with to automate the connection. (Don’t worry about running the ifconfig over and over, it won’t hurt anything)

set Command to "sudo ifconfig lo0 127.0.0.2 alias up; sudo ssh AliasForHost"

tell application "Terminal"
    if (count of windows) is 0 then
        do script Command
    else
        do script Command in window 1
    end if
    activate
end tell

You can save that script as an application using the AppleScript Script Editor and run it to automatically run those commands.

I know for most people that post was probably really confusing but I tried! If you need help please comment or something. I’ll get back to you.

Edit: Take a look at Fredrik’s script in the first comment for an even more automated solution for connecting and mounting.

Edit2: I found a much more efficient way to actually initiate the connection using a host alias, take a look at the part about the ssh_config file


Must have Mac OSX Apps: RCDefaultApp

RCDefaultApp allows you to set the default applications used for various URL schemes, file extensions, file types, MIME types, and Uniform Type Identifiers from within a very easy interface. You can manage all your default settings from a single point. A handy feature is the ability to override the default application for new files. For example, if you save a PSD with Photoshop but you always want to view PSDs with Xee, you can configure that here.

The application is freeware.
Go to the homepage of the author here:
http://www.rubicode.com/Software/RCDefaultApp/


Must have Mac OSX Apps: Xee

Xee is the ultimate replacement for the zero-feature image viewer that comes with your Mac.

The features of Xee are:

  • Display a large number of image formats – any format QuickTime or Preview
    can open, plus several more, including PCX, Maya IFF and Amiga IFF-ILBM.
  • Easily browse through folders of images – open any file in a
    folder and use the toolbar, keyboard shortcuts or mouse wheel to
    view the other images in the same folder.
  • Browse images inside archives, using the uncompression engine from
    The Unarchiver. It can read almost every format The Unarchiver can,
    including Zip, Rar, 7-Zip, Lzh, ISO and StuffIt. It also supports the
    CBZ and CBR formats, which are just renamed Zip and Rar files,
    respectively.
  • Effortlessly copy, move, rename and delete images while viewing.
  • Losslessly rotate and crop JPEG images. This lets you edit your
    digital photographs without losing quality by re-compressing them, like
    most other editors do.
  • View more EXIF data for JPEG files than Preview, and also other kinds of
    metadata, like XMP or IPTC. It can even try to identify what program or
    camera created a JPEG file by analyzing its quantization tables.
  • Extract bitmap images from inside PDF and SWF files. Many PDF files contain
    scanned pages in bitmap form, and Xee can read these and show them as
    bitmap images, and even save them. The same goes for bitmap images inside
    SWF files.
  • View images in full-screen.

Get it here: http://wakaba.c3.cx/s/apps/xee

