Friday, June 30, 2006

Noble Sales and Ignoble Sales

It has been a week of business development and pitching, which has meant that the server has been feeling a little neglected. I can hopefully put that right at some point today.

On another note though I saw over on Seth Godin's blog this post.

Now I've worked at the sharp end of sales, at the marketing coalface and even on the production room floor, so I do feel qualified to comment on this.

The problem is that sales isn't a bad thing; it is just all too often done badly by people who have no real bond to the company or its services beyond their commission cheque each month, and even less regard for their clients and customers. I have seen it all too often, and frankly I despise that kind of salesperson with the kind of passion I usually reserve for traffic wardens and London Transport.

The reason for this is that I think the real skill of selling is not winning the first order, but keeping my own phone ringing with existing customers so that I get the second, third, fourth (and so on) sale with little or no extra effort. To my mind, the only way to achieve that is by listening to what clients want and delivering it within a fair budget. All too often I've heard comments like "No matter what the customer's problem, the answer is rip it out and put in new stuff" or "I'm a firm believer that you can sell any rubbish just as long as you tart it up enough".

Now of course there are some environments where this kind of sale is probably desirable. These usually result in a single sale transaction between the customer and the supplier, and the salesperson is likely to be working somewhere else by the time the customer needs to purchase again. In this model, the car salesman, the insurance salesman, the ad space salesman, the door-to-door salesman and the estate agent are born. It's easy to see why the caricatures of these people are so negative, but sadly it's true that if the customer feels ripped off there is usually nothing that can be done about it, and so these sales professionals are already looking for their next prospect.

All of these, however, are focussed on a single transaction for a product that has a long ownership life. It all goes out of the door when you are selling a service. For a start, you don't have those neat little crib sheets any more showing your advantages over competitors, because there is no finely crafted product; there is only your own ability to listen, help the client define their needs, and then come back with a solution that meets them. This is the truly great skill of selling, and it requires real knowledge both of the client's situation and of their business needs. So you're not selling a vase any more; you are hearing about your client's need to create a pleasing environment and then getting designs created to demonstrate the client's perfect idea of what a vase should be. This is when it gets tricky, because you not only have to get inside the client's head to understand what they want, but you also have to be eloquent enough to describe it adequately in order for the client to give you the order.

Of course you could go and get it made for them, but then you are incurring costs, and if a particular vase costs thousands and thousands to make, then you have to be careful about that. Similarly, you have to have knowledge of the vase-making process: how long it takes, where to source materials, and the availability of a potter and a painter who work in, and are experienced with, the particular techniques required to make this particular vase for the client. All too often salespeople assume that one is exactly like another and carry on regardless.

This leads to setting the client's expectations of price and the promises made on delivery timescales. If you are not sure of the answer, either find out or take someone with you who is!

For some reason, though, when the salesperson comes back and talks about this great new deal they have signed, it is always someone else's problem to fix these misconceptions and inaccuracies, and because money is now at stake, the salesperson sides with the client. The result is that either a sub-standard service is offered by cutting costs, or the company takes a swan dive on the margins. Oh, and God help anyone in production who cannot stick to unrealistic deadlines which were promised to the client without anyone else's agreement or consultation.

This is sales done badly.

Sales done well is so very different. In that model, records are kept of every conversation, and these are open to anyone in order to follow the thought process of how the project came about. The salesperson is either a senior production person who has swapped across to handle sales (and gained the additional skills to do so), or they are accompanied by a competent production person who understands the complexities of the tasks required.

The price and deadlines are set in consultation and based on full disclosure of the job's scope. Sales and production are therefore working together for the benefit of the client, and the client's expectations are set in keeping with the actual scope of the project. Now, to me this is noble, professional sales, and it is undertaken by a professional who not only enjoys the thrill of the chase and the excitement of the kill, but who also loves their industry, is knowledgeable and has respect for their clients. In short, this is a salesman whose job is to make sure that their phone is constantly ringing with clients asking for repeat business. Anything marketing does on top is naturally a bonus. Oh, and in that model it's not just the salesperson who can be asked when the project will close; it's the client too, because it is in their interests as well to get things moving. The day when that kind of salesperson can be replaced by anything mechanised is a few generations away yet.

The other kind of salesperson probably should be replaced by a website, and as soon as possible.

Edit: It seems as though Michael thinks along the same lines as me.


Friday, June 23, 2006

Finding the SUN - The birth of a Sun Fire T2000 Sun Server - Day 5

Day five found me with a plan of attack to get things moving.

Three hours of manual reading last night revealed that I had a fundamental misunderstanding of the console command. I was viewing it as a command which launched a mini application, whereas it is more of a toggle between the system controller (SC) interface and the system console itself, which prints out what the server is doing. When I thought it had hung, what was actually happening was that it was simply not reporting any errors or diagnostic information.

I was also having problems logging in as root over telnet on the main server, which was giving me all manner of frustration.

A word with people far more knowledgeable about these things than me revealed this to be a security measure which prevents root from logging in via telnet.

The solution was therefore to create a new user from the SC, telnet in as this user, and then run the 'su' command to make changes to the IP setup using 'ifconfig'.

The only fly in the ointment was that I had been unable to find a mini network hub capable of stacking with others to give me the full number of network ports I required. My logic told me, though, that even if I had inadvertently configured two ports with the same IP address, as long as I only connected one at a time I should still be able to telnet to it, and my experiences yesterday proved that I could.

Unfortunately, today I can telnet into the network SC port and see it, but connecting the cable to the network card I configured doesn't seem to result in anything. Ping proves that the port is not responding and all packets are lost. Nothing has changed, and even a reset makes no difference. I currently have no idea why this worked yesterday and not today. Could it possibly be that too many failed telnet attempts as root somehow disable the port? I honestly cannot think of any other reason.

All in all this gives me something of a problem, because I don't know how to configure it without first accessing it. There must be a way; I just don't know what it is. Looks like it's back to the books.


At the time of writing the piece above I was well and truly stuck, but a breakthrough came along with a bit of luck.

Whilst flailing about blindly on my system console and switching between the SC and the console, I was suddenly confronted with a login prompt. Thinking about it now, I do get what the console command does, and my woes would have been sorted out a lot sooner if I had understood this earlier. The console is the same as any other terminal, except that it also displays diagnostic information; consequently, whilst trying to enter a command I suddenly got a login prompt and all became clear to me.

I was then able to get in and use ifconfig to work out what wasn't working. I knew that in a Unix system everything is basically a file which relates to a device, and that in order to work, the device first has to be initialised and loaded before it can be given settings. It therefore wasn't surprising to find that the problem with the network card was that it wasn't initialised. I'm not sure how it had managed to become uninitialised, but I'm guessing that some kind of conflict or error during a reset could have done it. Anyway, I was then able to search out the commands to initialise the card and then give it its new settings.
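Pieced together afterwards, the sequence of commands looks roughly like this. This is a sketch only: the interface name e1000g0 is a guess (run 'ifconfig -a' to see what your machine actually calls its ports) and the addresses are made up.

```
# show all interfaces, including any that are not yet initialised
ifconfig -a

# initialise ('plumb') the interface so it can be configured
ifconfig e1000g0 plumb

# give it an address and netmask and bring it up
ifconfig e1000g0 192.168.0.50 netmask 255.255.255.0 up

# check the result
ifconfig e1000g0
```

Worth knowing: settings made this way are lost on reboot; on Solaris the persistent version lives in a file such as /etc/hostname.e1000g0, together with an entry in /etc/hosts.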

I tested it and, lo and behold, I could connect to the network card. The problem still remained, though, that I couldn't log in as root over telnet. Going back to the system console, I decided to create a new user. I checked that the user had been created and then used the passwd command to set a password.
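The user creation itself is only a couple of commands from the console. Another sketch; the username here is invented, and /export/home is just the usual Solaris convention for home directories:

```
# create the user with a home directory
useradd -d /export/home/aaron -m aaron

# set a password so telnet logins will work
passwd aaron
```

Once telnetted in as that user, 'su -' gets you to root for the actual admin work.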

Success! I was able to telnet in as the new user. Taking Graeme's advice again, I decided to set up secure shell access, first editing /etc/ssh/sshd_config to set PermitRootLogin yes. Naturally this first required a refresher on commands for the vi text editor.
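For anyone repeating this, the change amounts to one line in the config file. Bear in mind that permitting root logins over ssh is a convenience/security trade-off:

```
# /etc/ssh/sshd_config -- allow root to log in over ssh
PermitRootLogin yes
```

After saving the file, sshd needs to reread its configuration; on Solaris 10, where ssh runs under the Service Management Facility, 'svcadm restart svc:/network/ssh:default' does it (earlier releases will differ).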

Another check showed that I could ssh to the server as root and telnet in as the new user.

This means that I am finishing my first week with a T2000 with a working base system. People with experience of these things will no doubt achieve the same in a single day, but as I said in the first of these posts, I am far from being a techie and have never had any kind of machine as big as this before, so to be in this position at the end of week one is a major achievement in my book. I have definitely cocked up, but I'm kinda on top of where I've cocked up, and by cocking up I've been able to learn what went wrong and how to fix it. All experience, whether good or bad, is good.

Now on to application setup.


Thursday, June 22, 2006

Finding the SUN - The birth of a Sun Fire T2000 Sun Server - Day 4

I spent yesterday out at client sites making some money for the agency, so today has been my first day when I could turn my attention back to the server.

To be honest it has been a very puzzling and perplexing day.

We had a network company in today to do some maintenance and establish our new, larger internet pipe, which meant everything had to be turned off. The problem is that after I've turned the Sun server back on, I can't seem to run console -f again, which I need to do in order to configure it for the new IP addresses. Whenever I do, I get a message saying that connecting to the console will delete my connection, and asking am I sure I want to do this, which is very odd.

I'm looking online for resources, but currently after power-on I can't do anything that will let me configure it.


Tuesday, June 20, 2006

Finding the SUN - The birth of a Sun Fire T2000 Sun Server - Day 1

Out of the box - Day one

When I got the call from the front desk that the new server had arrived, I will admit that I looked at my coffee cup and wished it had something a bit stronger in there. At the same time I could hear Angelina Jolie whispering inside my head that this was too much machine for me.

I've never been one to back down from a fight though, and so this David went to confront his Goliath. Waiting for me were two boxes: one with two power cords (how nice of them to supply an extra one for a monitor, I thought), and one containing the beast itself. The larger box proved a good deal heavier than its size led me to believe, but after carefully lugging it to a vacant desk, and making a mental note to visit a chiropractor, I started to unpack it.

Inside the Ikea-inspired (this is a good thing) packaging awaited a highly attractive server, a minimal documentation pouch giving a list of online resources, a couple of Cat 5 cables, and several adapters that mystified me.

I had a feeling that I should let the air flow around the machine, and so, rightly or wrongly, decided to balance it on three points of the sturdy packaging rather than placing it flat on the desk. Looking around, I located USB ports for a keyboard and mouse and started to look for a monitor socket. This is the first point where my lack of experience with purpose-built 'proper' servers was exposed, because there isn't one.

Deciding at this point that I needed help, and thinking that experience-based help would be a better bet than simply reading the online manuals alone, I had a Google for "Sun T2000 getting started".

Throughout this trial I’m going to be saving my searches and online resources on My so if anyone reading this is looking for resources please feel free to check there.

The search resulted in me finding Graeme and his Getting started With a Sun T2000 entry on his Notes from a Messy Desk Blog.

Sun have obviously been taking notes on the experiences of people with the trial because there was a discrepancy between Graeme’s experience and mine.

I had worked out, after attaching both power cables to the back of the Sun Fire (ahhhh, that's what the other one was for), that it was probably a bad idea to go ahead and connect it to the mains, and so I was slightly worried when he mentioned that I would need a console cable which was not supplied, and that I should contact a Sun reseller to get one.

For those people who will need to contact Sun during their trial and are used to the less-than-efficient responses of other corporations, I can wholeheartedly endorse Sun. The phone rang and was answered immediately by a human being armed with knowledge. Slightly shocked, I explained my question and asked whether I could use the Cat 5 cables supplied or if I would need to obtain another cable. Notes were taken and a promise to email back was made. Somewhat surprisingly, the answer arrived by email within half an hour: I could indeed use the cables supplied.

In true heroic story format, I had a bit of a flashback to my youth, when my own personal Obi-Wan Kenobi (Jonathan Semple) taught a keen 18-year-old about setting up a system console on SCO Unix using serial cables. Suddenly the adapters supplied made sense, and so I fitted one end of a Cat 5 cable to the system console port on the Sun Fire and the other end to the 9-pin serial-to-RJ45 adapter. I then found out that my modern Sony Vaio laptop didn't have a serial port. Of course, this also led to the realisation that I didn't need the USB keyboard and mouse connected to the Sun Fire either, so I disconnected them.

The serious point to make is that setting up the Sun Fire is probably best achieved with the addition of whatever old desktop you happen to have lying around, as this is more likely to have an available serial port.

Loading up HyperTerminal, I took Graeme's advice again and set it up to connect through COM1 at 9600 bps with 8 data bits, 1 stop bit, no parity and no flow control whatsoever (yeah, let it flow free, man!). In today's modern world the whole concept of serial communication is probably going to be alien to a lot of people but, trust me, it really is a piece of cake if you stick to the settings above.
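For reference, the settings boil down to the summary below, and if (unlike me) you have a Unix or Linux box with a serial port to hand, a terminal program can stand in for HyperTerminal. The device name /dev/ttyS0 is an assumption; yours may well differ:

```
Serial settings:  9600 bps, 8 data bits, 1 stop bit, no parity, no flow control

From a Unix/Linux machine, either of these should give an equivalent session:
    screen /dev/ttyS0 9600
    cu -l /dev/ttyS0 -s 9600
```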

I was now ready to switch on, and so after taking a mental brandy I plugged in, and the machine quietly came to life, presenting me with green lights and a whole load of diagnostic information in HyperTerminal.

This is actually a good point: if nothing appears in your terminal window then you've done something wrong, and you should check that you've plugged the cable into the right serial port and have the settings correct.

At this point you will see a system prompt, and what the machine wants is its IP network settings.

Probably like a lot of small agencies, we outsource our network to an external company who support Windows boxes only, and so I had to ring them to get a vacant private IP address. This took longer than 3 minutes and resulted in the machine dropping my terminal connection. Reconnecting didn't help, and I couldn't find a reset switch on the box, so I elected to unplug and re-plug the mains cables. This didn't seem to have an adverse effect on the box, and then enabled me to set my network settings with the following commands.

setsc if_network true
setsc netsc_ipaddr
setsc netsc_ipnetmask
setsc netsc_ipgateway

I was also kind of inquisitive, and so went through the process again to enter more settings, including setting up the management port as a networked resource using 'setupsc' and then 'resetsc'.
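To make the commands above concrete, a worked example at the SC prompt would look something like this. The addresses are made up, so substitute your own; 'resetsc' restarts the system controller so that the new settings take effect:

```
sc> setsc if_network true
sc> setsc netsc_ipaddr 192.168.0.40
sc> setsc netsc_ipnetmask 255.255.255.0
sc> setsc netsc_ipgateway 192.168.0.1
sc> resetsc
```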

Of course, I also realised that I was going to need more than just the one IP address, and so called our IT company again (a conversation worthy of an entry of its own: "Exactly how many do you think you're going to need?"). The bare minimum I suggest is two: one for the management console and one to make the machine available to the rest of the network. You should also note that the serial management port displays more diagnostic information than the network-connected port does, so having some sort of access to a serial console is still a good idea.

Armed with all this experience, and having already set passwords for the various access methods, I took my courage in both hands and typed 'poweron'.

I think I already mentioned that I was setting the machine up at a vacant desk in the office. Ours is not a large office, and so everyone around was treated to two fans blasting, masking the sound of the air conditioning, drowning out any attempts at telephone conversation, and destroying any hope of people more than 4 feet away hearing what you said to them. I'm not kidding: this is not a quiet machine, and in hindsight I should have put it away in a rack in a separate room before doing this.

Once in, there was no stopping me, and so after promising to buy everyone affected a drink later, I ran 'console -f', where the rest of the computer's configuration was set.

To be honest, this did give me a few problems. I had chosen PC Console as the emulation type in HyperTerminal, and I'm thinking I set this incorrectly, because the menus I was presented with selected the option above the one I actually chose, which led to a certain amount of fiddling about as I corrected my time zone from Australia to Europe.

Things like the computer's name went without incident, and so I entered 'Rosco', after Roscoe Tanner, because he was a big server (big server... gettit?).

What did give me problems was setting up a name server. Our network is a bog-standard Windows network without frills, and so after my attempts to set up 'none' failed, I had to work my way through the list before I found an option which didn't lead to an error preventing me from going any further. Finally I set it to DNS and entered our basic DNS details, although this did later result in an error when Rosco couldn't find himself listed in it (predictably). I've made a mental note to establish local DNS for the office, so this should sort itself out over the next few days. Basically, though, this was job done for getting the server established, and so I entered 'shutdown', which took the machine back to single-user mode, and then 'poweroff' to give everyone's eardrums a break.

It was now time to set Rosco in his rack in the computer/filing/box room.

Now, I'll freely admit that I never had a Meccano kit when I was a kid, but even with that caveat I found assembling the rack mount the most difficult part of the job. At one point I had one detached arm connected to the server and some sort of telescopic lightsaber swaying precariously as I attempted to fix the other side. Eventually I found a release button which detached the remaining arm, but to me the instructions supplied were not intuitive enough. Success was only really achieved when I got one of the junior account executives to help me steady things as I screwed them into the rack. Having said that, once this was done the machine slid in without problems and seated very comfortably and securely in place.

Cables were re-attached, but the dual requirement of a network port for the machine and a network port for the management console was something I hadn't considered before the machine arrived, and only one spare port was available. There wasn't room for another connection, and so I plugged the serial management connection into the back of the Windows server in order to free up the single port for connecting the machine to the network. Currently I have to use the Windows box if I want to do any maintenance on the Sun box, and cannot do so over telnet. I've made a mental note to buy one of those mini stackable network hubs from PC World, which should sort the problem out.

That was really day one completed, and a well-deserved beer was removed from the agency hospitality fridge.

As a side note, some of you may be wondering why I'm running this as an internal machine when the applications I want to run are clearly designed to be public. The answer is that, firstly, the machine isn't configured to be public yet, but the main reason is that those wonderful people at BT have not (after two attempts) been able to configure our asymmetric internet connection correctly, and therefore I just don't have the bandwidth to test anything, and won't until Friday at the earliest.


Monday, June 19, 2006

Finding the SUN - The birth of a Sun Fire T2000 Sun Server - Introduction

A little while ago I commented on a post from James Governor, which I initially thought was only open to a select few very well-connected technical gurus.

It concerned Sun literally giving away servers. Well, the truth is (certainly initially) not quite as glamorous, but the prospect still remains: Sun are very keen for people to see how incredibly good their new servers are, and are therefore running a trial review scheme whereby companies receive a server for 60 days and review it on their blogs. The hook is that if you write a particularly informative review, they may let you keep it.

This is actually the perfect server for the Interactive Mix applications I am currently setting up and the recent campaigns we have run, and so the temptation proved too much for me and I contacted Sun. I'll admit that I didn't hold out a lot of hope of being included in the scheme, but to my surprise we were accepted, and this morning a brand spanking new T2000 arrived.

At this point a rather large intake of breath is required, because this is a whole lot of machine and I am very far from being a techie. It's true that 20 years ago my first ever job was as a systems administrator on an NCR Tower range running SCO Unix, and that I very badly wrote some code that ran on the Internet until 2002, but I have always considered myself a marketing chap with an understanding of computers rather than any kind of computer guru. In fact, the last time I did anything remotely technical professionally was 1998, and so this kind of machine is several shades beyond my comfort zone.

Now, I've always thought of myself as above average intelligence but lacking in any common sense whatsoever (ask any of my ex-wives!), and so I thought that this series of write-ups could potentially show how literally anyone with a brain can get one of these working, taking it out of the Jesus-creeper domain and onto a business footing. My hope is therefore to show that you don't need a dedicated techie to do this, just someone who understands computers a bit and is prepared to look for answers.

In this way I'm hoping that the appeal of what should be a truly outstanding machine will widen, and that it will find its place offering business solutions to SMEs as well as to larger corporations with swelling IT departments.

I have no idea how I’m going to get on but I will record all of it here.


What is Web 2.0?

It's an interesting question that was put to me today. I've been talking a lot about Web 2.0, but have never actually written a definition myself which encapsulates what it is and identifies its key aspects. I was asked the direct question this morning, and so below is my stab at defining the new web order.

Web 2.0 is a move away from the first web model, where there were a few publishers and many consumers of information. For example, a company created a website and lots of potential customers looked at it. The barriers to creating a website (better thought of as content) were too high for most users in this model and, let's face it, your average AOL user can just about cope with email, so HTML was going to look like PhD physics to them. What I'm driving at is that in order to publish information in the first web model, a user needed to understand markup language and have some grasp of image creation, layout and hosting before they were even ready to start gathering their thoughts together to publish.

Web 2.0 takes the view that every consumer of information should be able to be a publisher as well. This is purely subjective, but in my view there are several key sites and technologies that make Web 2.0 available to the masses.

Picture phones and digital cameras make it much easier to gather graphics together now, and this is as true for video content as it is for still photography. This is completely personalised graphical content and is therefore very focused on the user. Photo-sharing sites offer a very easy to use and highly configurable way to make photographs available across the web.

YouTube does the same thing with video content.

Blog hosting sites, LiveJournal and MySpace provided extremely easy to use layouts and publication mechanisms that even the tabloid-reading masses could get their heads around.

These sites (and obviously there are others) should all be viewed as Web 2.0 projects and the cornerstones of the Web 2.0 publication method.

The essence of Web 2.0 is therefore encouraging content not simply from brand owners but from their customers, which is why I bring the essence of Web 2.0 for business down to the phrases "Web 2.0 enables companies to have a closer relationship with their customers than ever before" and "brands have the opportunity to create personal relationships with consumers". The key to this is user content and the creation of online discussions about the products and services offered by companies. A classic Web 2.0 example is AOL's 'Discuss' project about the very fabric of the Internet itself.

The problem with all this content is that with everyone publishing, there will be an absolute wealth of it out there. The questions are how to categorise it and how to consume it.

Consuming the content is achieved via a syndication model, and RSS (Really Simple Syndication) is the model of choice for the web. No conversation about Web 2.0 is complete without a section devoted to RSS and RSS readers.

In the old model, a user searched for a website and entered via the homepage through a browser. They then navigated to the section they wanted and found the content. Some websites were updated regularly and read regularly by consumers, but the same navigation model was used to check for updates. What I'm getting at is that this is a lot of time wasted looking and checking for information, when the information itself is the valuable item.

RSS enables a site to be syndicated; RSS readers monitor sites for updates automatically and then alert the user. The user is then able to scan headlines to see which individual pages they want to read. The point is that individual pages are read as they are added and updated, rather than navigated to from a homepage. It saves time and is therefore significantly more productive.
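Under the hood, an RSS feed is just a small XML file listing the site's latest items; a minimal, hand-written illustration of an RSS 2.0 feed (the URLs and titles are invented) looks something like this:

```
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <link>http://example.com/</link>
    <description>Latest posts from an example blog</description>
    <item>
      <title>Finding the SUN - Day 5</title>
      <link>http://example.com/day-5</link>
      <pubDate>Fri, 23 Jun 2006 09:00:00 GMT</pubDate>
      <description>Getting the T2000's network card talking.</description>
    </item>
  </channel>
</rss>
```

An RSS reader simply fetches this file on a schedule and flags any item it hasn't seen before.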

This, however, does not solve the problem of categorising the content. Blogs account for a great deal of this content, as they are the largest publication method. Google therefore created its blog search engine, which ignores all other content and concentrates only on user-generated content.

The biggest and most respected tool for categorising this content is, however, Technorati. It uses the concept of tagging to add short descriptions of what each piece of content is actually about. Tags are added to the bottom of the content as short descriptions and keywords. These are then categorised by Technorati and can be searched. Similarly, it relates blogs sharing tags to each other in what are known as clouds. The idea is that if two people are using the same tags as each other, then the chances are that their content is quite similar, and therefore readers of one will find the other relevant.
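In practice, a Technorati-style tag is just an ordinary link carrying a rel="tag" attribute at the bottom of the post; for example (illustrative markup, with the tag name taken from the last part of the URL):

```
<!-- Technorati-style tag links: the final path segment is the tag itself -->
<a href="http://technorati.com/tag/web2.0" rel="tag">Web 2.0</a>
<a href="http://technorati.com/tag/rss" rel="tag">RSS</a>
```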

There are obviously new methods of, and improvements to, categorisation coming along all the time, and many of them are led by users themselves. One concept which is highly useful is called 'Declarative Living'. Quite simply, this is a discipline whereby whatever you are reading, and the sources you find, are published for others to see. The logic follows that if people are interested in what you are saying, they are also likely to be interested in your sources. This is achieved via Outline Processor Markup Language (OPML). Every RSS reader has a listing of the blogs, sites and sources which the individual user reads. These can be exported as OPML files. Any good RSS reader can import an OPML file, and so publishing your OPML file for others to download and import into their own RSS reader's list is a very good idea. The concept was first introduced to me by James Governor, who also has a blog you should read. Google him or look for him in Technorati. My own OPML file can be found here.
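An OPML file itself is nothing exotic; an exported reading list is a small XML file along these lines (an illustrative example with made-up feed URLs):

```
<?xml version="1.0" encoding="UTF-8"?>
<opml version="1.1">
  <head>
    <title>My subscriptions</title>
  </head>
  <body>
    <!-- one outline element per subscribed feed -->
    <outline text="Notes from a Messy Desk" type="rss"
             xmlUrl="http://example.com/messy-desk/rss.xml" />
    <outline text="Seth Godin's Blog" type="rss"
             xmlUrl="http://example.com/seth/rss.xml" />
  </body>
</opml>
```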

There you go: an introduction to Web 2.0. That's procedurally how things work, with a little bit of technical information as well, together with identifying the major players as viewed by Aaron Savage. The real question, though, is what does all this mean for business? Advertising is the medium having to adapt quickest to this, but others will follow.


Wednesday, June 14, 2006

BBC offer News Alerts via RSS.

Emphasising my return to blogging form, I'm really impressed with how the BBC is embracing and rolling out the Web 2.0 model.

This article in today's Brand Republic highlights their latest offering, which is BBC news delivered via RSS. The improvement over the old service is that users specify which alerts they receive, based on categories, rather than receiving absolutely everything.

For those who remember PointCast, this is a massive step towards the idea that you receive the news you want, without being deluged with news you have no interest in whatsoever.

You can subscribe to the various feeds from


Are RIAs taking enough notice of usability?

A report has crossed my desk talking about how much of the excitement around Web 2.0 projects is based on Rich Internet Applications (RIAs). Now, I'm somewhat taken aback by the fact that certain corners are only just waking up to the idea that highly interactive applications can be developed on the Internet, but that is another story. The main thing that struck me was the discussion relating to AJAX frameworks (I don't think you can call AJAX a technology in and of itself).

There is a great deal of work being produced at the moment using AJAX frameworks to constantly pass messages between the client and the server, in order to produce intuitive and adaptive interfaces based on human behaviour, but the report concentrates on the danger that standards are not being adhered to in these new developments and that users' needs are not being met.

This is interesting and does highlight pitfalls which are appearing with new technologies. My overriding thought, though, is that if you change the year to 2002 and the technology to FLASH, you could pretty much have printed the same piece.

The bigger story, in my view, is that as new technologies come on board there is always a period of hysteria, when early adopters and standards bodies stop thinking about each other and stop talking to each other. Early adopters want to push the envelope, and standards bodies want to absorb the technology into an existing and understood framework.

On the one hand, unless people are prepared to push things as far as they can go, the new tech will take longer to mature; on the other, the danger exists that bad practices will become the standard, resulting in a very typical FLASH-type argument with opposing armies gathered: one holding the view that the technology is worthless and should be outlawed, the other holding it up as the language of their deity of choice.

I'm not too sure what the answer is, because I think both attitudes have a lot of merit. Maybe we need a 'safe use' and 'play use' policy, so that some projects are understood to be non-standard and aim to push the boundaries of the technology itself, whilst others ensure compliance and show early quick wins for organisations. The two parties do, however, need to talk to each other, and the glue that could bind them together is the user and usability.

I've been involved in several projects over the years that were not just bleeding edge but positively splattered in the new areas they were exploring, and I will freely admit that in some cases (FLASH, for instance) we missed basic points concerning usability and the longevity of a site. But in so doing, we found out very quickly what the technologies were capable of, which placed us in a very good position to know when they were good to use and when they were not.

The point I am making is that it may be a little unfair to criticise the early adopters of RIAs, as these people will work out where the boundaries are for the rest of the development community to follow. As long as a sense of usability is kept, things won't drift too far astray, and even if they do, people are still learning, which doesn't seem like a bad idea to me.
