Wednesday, 15 August 2012

Noticing Networking

I was at the bus interchange platform the other day in the afternoon, just at that time when all the school kids are leaving McDonald's and figuring out that they finally do need to go home. Needless to say the platform was packed. So, knowing that it was going to take me forever to actually get a bus, I started noticing something. Whilst nearly all of the kids were grouped together with their friends, every now and again one student would walk over, dragging their group with them, to talk to another person. The main friend would talk to the people in that second group whom they knew, whilst the others just stood around awkwardly, maybe playing with their phones, talking to each other, or, as I found was most common, staring at their shoes.

And isn't that just the most brilliant of segues into talking about yesterday's INFS lecture on social network analysis?!? (I know, right. So skilled).

Coming to understand the fundamentals of a social network (social entities, such as individuals, corporations or collective social units, tied together with other social entities through certain types of social ties, like formal relations or behavioural interaction) really came down to a larger number of things than I would have initially thought.

For instance, whilst it was kind of obvious and easy to understand that if there are two social entities involved then the relationship between them will be dyadic, and if there are three entities it will be triadic, I didn't actually realise that you could further define these types of relations as being one of two types.



Transitivity is the relation we encounter most often in day-to-day life: when a person likes a second person, and that second person likes a third person, it is assumed that the first person will also like the third person.

Alternatively, there is the notion of balance, which holds that if the first person and the second person like each other, they should have a similar liking or evaluation of the third person. Or, if they dislike each other, then their evaluations of the third person will differ.
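The transitivity idea can be sketched in a few lines of Python (the names and the "likes" relation here are entirely made up by me, not from the lecture):

```python
# Toy sketch of checking transitivity in a "likes" relation.
# Each pair (a, b) means "a likes b".
likes = {("Anna", "Ben"), ("Ben", "Cara")}

def is_transitive_triad(likes, a, b, c):
    """If a likes b and b likes c, transitivity predicts a likes c."""
    if (a, b) in likes and (b, c) in likes:
        return (a, c) in likes
    return True  # premise doesn't hold, so nothing is violated

print(is_transitive_triad(likes, "Anna", "Ben", "Cara"))  # False: Anna doesn't (yet) like Cara
likes.add(("Anna", "Cara"))
print(is_transitive_triad(likes, "Anna", "Ben", "Cara"))  # True: now the triad is transitive
```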

So up until this point a lot of what was being said was kind of obvious and easy to grasp. Then it didn't so much get harder as become more technical.

We live in a world where understanding how social networks function is highly important, just because there are so many networks out there and they are such a fundamental aspect of everyday life for so many people.



I mean, these social networks have been in existence for so long that you would have thought that even people in early time periods had some way of determining who was the pivotal person in any social network. That person was probably the king, though, so I don't think they'd have had such a hard time figuring it out as we do nowadays.

And the first of the tools used to figure this out is called degree centrality. Ultimately it means figuring out who the core member of a group is, the person at the centre. Measured on an undirected network, it determines how central a person is simply by counting the ties between that person and all the others. Furthering this, you can measure a node's closeness centrality (how close that focal person is to every other person in the network) as well as its betweenness centrality (how often that person sits on the paths between the other pairs of people in the network; really, who is the go-between that everybody else's interactions have to pass through).

Lastly, there is the measurement of the degree of prestige of a node within a network. This works to determine how well regarded a person in a network is. Only applicable to directed networks, what is important here is the number of relations going into the particular node: not the ties from the focus node out to other people, but from other people to that focus node. This enables a social network analyst to determine who the most respected member of a society might be.
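As a toy illustration of these measures, here is a rough Python sketch (entirely my own, with made-up people, not anything from the lecture) of degree centrality, closeness centrality and in-degree prestige on a tiny network:

```python
from collections import deque

# Undirected friendship ties between four hypothetical people
edges = [("A", "B"), ("A", "C"), ("A", "D"), ("B", "C")]
nodes = {"A", "B", "C", "D"}
adj = {n: set() for n in nodes}
for u, v in edges:
    adj[u].add(v)
    adj[v].add(u)

# Degree centrality: number of ties, normalised by (n - 1)
def degree_centrality(node):
    return len(adj[node]) / (len(nodes) - 1)

# Closeness centrality: (n - 1) over the sum of shortest-path
# distances from the node to everyone else (found here with a BFS)
def closeness_centrality(node):
    dist = {node: 0}
    queue = deque([node])
    while queue:
        current = queue.popleft()
        for neighbour in adj[current]:
            if neighbour not in dist:
                dist[neighbour] = dist[current] + 1
                queue.append(neighbour)
    return (len(nodes) - 1) / sum(dist[n] for n in nodes if n != node)

print(degree_centrality("A"))     # 1.0 -- A is tied to everyone
print(closeness_centrality("D"))  # 0.6 -- D only reaches the others through A

# Prestige: in a *directed* network, count the ties pointing *into* a node
directed = [("B", "A"), ("C", "A"), ("D", "A"), ("A", "B")]
in_degree = {n: 0 for n in nodes}
for u, v in directed:
    in_degree[v] += 1
print(in_degree["A"])  # 3 -- A is the most "prestigious"
```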

Really, at the end of the day, social network analysis allows us to sit down at the bus interchange with all the free time on our hands and conclude as to which screaming child would be the one to hold hostage and make the others shut up.

Or does that only happen in movies?


Image 1 available: http://www.talentsfromindia.com/social-network-web-developers-programmers.html
Image 2 available: http://www.freeiconsdownload.com/Free_Downloads.asp?id=661 

Monday, 13 August 2012

From Me to You

In today's lecture, the topic of Peer to Peer networks was discussed. About halfway through the lecture a list of example P2P applications was shown. Up until that point, I felt relatively at ease with understanding all of the information presented. The definitions were clearly written on the slides and were examined, explained and effectively discussed in detail. And don't get me wrong, I still felt like it was a really good lecture after seeing that example list. No, what shocked me was that of all the examples, I had only ever used one (Skype) and apparently it couldn't be called a complete P2P network. Ultimately, I realised, I had actually had very little to do with P2P networks in all my computer life.

Now, we live in the age of sharing, when we must instantly tag and share that photo of a friend snorting milk out of their nostril, the one that perfectly captures just how hilarious we think we and our friends are. So realising that I had never really used a P2P network made me feel like some kind of outsider.

As I understand it, a peer to peer network involves the sharing of resources and services by direct exchange from one 'peer' to another. A peer in this sense is a computer or user who has the ability to upload and download information that can then be sent to others. The main aspect of a P2P network is that it involves direct collaboration.



Contrastingly, the main type of network involves a client/server scenario, where a server (a mainframe computer or hub) is situated as a sort of middle man between each of the individual client computers. This server contains all the information that any of the other computers could possibly need. When one of these computers needs some data, the server merely searches through its massive database to find what is being searched for and sends it out to that computer. However, herein lies the fault with client/server networks.



Whilst this type of network is well-known, powerful, easy to maintain and generally a very successful model (the World Wide Web functions in this way), the fact that it is the server that contains all the information means that if it crashes or experiences failure, then the whole network is accordingly affected. Data cannot be accessed, a web page cannot be loaded. Indeed, you cannot connect to or access anything.

Another downside is that because there is only one server at the centre of a large number of client computers, the mass queues that form more often than not slow down communication quite extensively.

What a P2P network enables is for there to be no single point of failure. No server, no central location for all the trouble to be contained within. Instead each individual peer acts as a server, containing and contributing resources such as memory, disk space and network bandwidth. Also, within a P2P network the individual has the option of leaving whenever they feel like it, without having to worry about the effects that might be had.
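To convince myself of the single-point-of-failure idea, here's a little Python sketch (machine names invented by me) comparing who can still reach whom when one node dies in each model:

```python
# Compare what happens when one machine fails in a client/server
# network versus a P2P network, using a simple reachability check.
def reachable(adjacency, start, removed):
    """All nodes reachable from `start`, pretending `removed` has failed."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        if node in seen or node == removed:
            continue
        seen.add(node)
        stack.extend(adjacency.get(node, []))
    return seen

# Client/server: every client talks only to the server
client_server = {"server": ["c1", "c2", "c3"],
                 "c1": ["server"], "c2": ["server"], "c3": ["server"]}
# P2P: peers hold ties to each other directly
p2p = {"c1": ["c2", "c3"], "c2": ["c1", "c3"], "c3": ["c1", "c2"]}

print(reachable(client_server, "c1", removed="server"))  # {'c1'} -- stranded
print(reachable(p2p, "c1", removed="c2"))  # {'c1', 'c3'} -- still connected
```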

There can, however, be hybrid-model P2P applications. Whilst the hybrid is P2P in the sense that files are still transferred from peer to peer, a central server still exists as a middle man, bringing the risk of a single point of failure back to that particular application. The model is still centralised.

A fully decentralised model involves no central server at all, resulting in no single point of failure, but also no central authority responsible for regulating control of the application. More often than not this leads to not being able to guarantee results: there is so much information present that, whilst it can be searched, there is just no feasible way to search all of it.

At the end of the day though, P2P networks have the potential to be really great file sharing systems. A further development is structured P2P networks, which guarantee a limit on the number of connections between nodes needed to answer a particular search, although research on this has apparently slowed down at the moment. Like I would really know, being the big user/understander of P2P networks that I am.

P2P: when sharing is not necessarily caring (it's basically just take, take, take).

Image 1 available: http://www.techsoup.org/learningcenter/networks/page4774.cfm 
Image 2 available: http://en.wikipedia.org/wiki/Peer-to-peer 


Tuesday, 7 August 2012

Making the Connection

It's kind of hard for me to understand and imagine how the world functioned without social networking sites. Taking everything into consideration, the fact that nearly everyone has Facebook, Twitter is becoming more popular, and the number of social media sites is increasing every day makes me really question how people would respond if a time-shift reversal was to occur and we headed back to a time when communication occurred via the postal network as opposed to online social networks. It's hard to imagine how people would respond in that instance, although I'm picturing a very "headless chook"-like scenario.

So, I've already mentioned briefly that communication in the "good ol' days" occurred through things like the postal network, and the postal network can work as a good example of what exactly a network is. A network is a collection of nodes and links that allow information and resources to be shared.

So, with a postal network, the post office (which is the node) sends and receives letters, parcels and postcards (which are the information and resources) and sends them off to other nodes (houses, other post offices) via postal vans travelling along roads (the links or edges between each node). A computer network, or any other network for that matter, functions in much or exactly the same way. The basic notion of networking comes into play when there are objects linked together.
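If you wanted to play with the idea, the postal network translates into a few lines of Python, with nodes, links and a parcel finding a route along them (the place names are made up):

```python
# A minimal sketch of the postal-network idea: nodes linked by edges,
# and a parcel hunting for a route from one node to another.
network = {
    "post_office": ["house_1", "house_2", "other_post_office"],
    "other_post_office": ["house_3"],
}

def deliver(network, start, destination, path=None):
    """Depth-first search for a route a postal van could drive."""
    path = (path or []) + [start]
    if start == destination:
        return path
    for neighbour in network.get(start, []):
        if neighbour not in path:
            route = deliver(network, neighbour, destination, path)
            if route:
                return route
    return None  # no chain of links reaches the destination

print(deliver(network, "post_office", "house_3"))
# ['post_office', 'other_post_office', 'house_3']
```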

A social network, such as Flickr or Facebook, works as the platform on the web that connects the nodes. These social networks are all centred around a specific object. On Flickr it's the photos, on Facebook it's a number of things including but not limited to photos, videos and the focus object, friends. It's all about object-centred sociality. And that I think, is where the headless chooks would have the biggest problems.

We've become so focused on material goods. Objects that we can actually hold in our hands, or somehow possess in a virtual sense. Having a certain number of objects, being the person with the most this or that, the most expensive this or that, the most up-to-date this or that, is really the focus of our everyday lives. Successful social software focuses on similar objects of sociality: objects that mediate the social ties between people. We must be connected to others. It's almost like if you aren't connected to somebody else then you can officially consider yourself an outcast, behind the times. We are being presented with an expanded conception of sociality: an expanded conception of how we socialise with others, what platforms enable us to do this, and how our virtual devices suddenly make us seem more connected to the real and material world.

There are two main theories associated with understanding networks. The first, "Graph Theory", dating back to around 1736, concerns itself with the mathematics behind graphs and networks. It focuses on the rules and protocols behind networks as opposed to the semantics or logics, understanding that nodes and their links do not have semantics (that is, a node does not have a social identity). Comparatively, the second theory, which is just straight out called "Network Theory", associates itself with real-world phenomena. That is, things like social networks, economic networks, energy networks, where something is actually happening. Where the networks are actually achieving something physical, be it making new relations, making money or making people more powerful. It also classifies networks as dynamic and active. That networks are constantly changing and constantly being updated is true. I mean, think about your Facebook friends (or if you must, your real friends). Are you friends now with the same people that you were 5 years ago? Maybe, maybe not. The point is you and your friends have changed. And in that sense, the friend network has been changed. Your personal behaviour has been affected, as has the behaviour of the entire network, your entire group of friends.

I think that networks, both computer and real world, have generated such an accumulation of interest in recent years that it will be a long time before the final network system is created. In fact, there might never be a definite end to the creation of networks. And now with the proposed introduction of Web 3.0, which is focused on individual tagging and computer-to-computer interaction, well, a whole series of networks might open up. And while there are a lot of other terms associated with networking that I have failed to mention, the basic gist is this: networks are about connection and linking.

Without them (without networks that is) your 'friends' might have been very different people. You might even have been a very different person. Something to think about. And if you did want to look at social networks in more detail this site has some very interesting points to make. Or if you wanted to look more closely at computer networks, how about this?


Image 1 available: http://socialnetworkingeducation.wikia.com/wiki/Social_Networking_in_Education_Wiki
Image 2 available: http://www.contrib.andrew.cmu.edu/~gsooj/c.htm 

Monday, 6 August 2012

Speaking the Same Language

I've always wanted to learn to speak another language and whilst French or maybe Italian were the two at the top of my list, now I'm thinking that they'll have to take a backseat while I try to learn to speak "computer".

The most common of all the languages of the web, as I was informed today, is called HTML.

Hypertext Mark-up Language is used by browsers like Chrome or my personal favourite Safari to display information so that we, mere humans that we are, can read web documents as either audio or visual material. The way that HTML does this is through the use of tags, which mark what certain structural elements of that particular web document are. For example, a paragraph could be tagged with <p> and the browser would be able to see that the document contained a paragraph by reading and understanding the <p> to mean "paragraph". It can read and understand that because of HTML.
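Just to see the tag idea in action, here's a tiny Python sketch using the standard library's html.parser module, which spots tags much like a browser's parser would (the snippet of HTML is made up):

```python
# Walk a scrap of HTML the way a browser's parser does: notice each
# tag, and notice the text content sitting inside it.
from html.parser import HTMLParser

class TagSpotter(HTMLParser):
    """Collects the structural elements a browser would recognise."""
    def __init__(self):
        super().__init__()
        self.found = []

    def handle_starttag(self, tag, attrs):
        self.found.append(("tag", tag))

    def handle_data(self, data):
        if data.strip():
            self.found.append(("text", data.strip()))

spotter = TagSpotter()
spotter.feed("<p>Hello, web!</p>")
print(spotter.found)  # [('tag', 'p'), ('text', 'Hello, web!')]
```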

Now, HTML is concerned with formatting and structure, the layout of a web document, and in that sense it has a lot (and this apparently cannot be stressed enough) A LOT to do with linking things. I mean, if you had a document that was 15 pages long, there would need to be quite some time spent scrolling through those 15 pages. What HTML lets you do is link those pages as 15 separate one-page documents all connected together, so instead of scrolling and feeling about the same size as a pea from the sudden onslaught of information being presented to you, you get to sit back and be maybe the size of a grapefruit instead. Or any other medium-to-large-sized fruit or vegetable of your choosing.

And whilst HTML was used quite a bit, it wasn't until 2000 (and HTML 4) that it was recognised as being "legit" and the standard, most common language dominating the readability of the web.

Some people, however, began to feel that maybe HTML wasn't all that it was cracked up to be. They started thinking that maybe not being able to separate the form and content of a document was some big deal or whatever. So then they came up with XML.

XML allows the content and the form of a document to be defined separately, still through the use of tagging. The difference is that now tags like <link> and <title> are defined within the document itself rather than being fixed by the language. The big thing with eXtensible Mark-up Language is that users can define their own tags, which is basically where the whole extensible/extendable thing comes into it, let alone into the title. XML is all about focusing on what the data is: defining it, storing it and transporting it. Comparatively, HTML is all about the look, what the data looks like. Although these are two very separate functions/languages, the line between them has become a little bit fuzzy with the introduction of HTML5, the latest in a long line of HTML and other web languages.
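To see the define-your-own-tags idea in practice, here's a small Python sketch using the built-in xml.etree.ElementTree module (the <blogpost> vocabulary is my own invention, which is rather the point of XML):

```python
# Parse an XML document whose tags are entirely user-defined:
# nothing in the XML spec knows what a <blogpost> or a <mood> is.
import xml.etree.ElementTree as ET

doc = """
<blogpost>
  <title>Speaking the Same Language</title>
  <mood>confused but determined</mood>
</blogpost>
"""

root = ET.fromstring(doc)
print(root.tag)                 # blogpost
print(root.find("title").text)  # Speaking the Same Language
print(root.find("mood").text)   # confused but determined
```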

You can basically understand everything on this web page that is my blog because of the presence and understanding capability of HTML/XML/XHTML. One of those ones. So woot for them!

I think for me, it's just cool to think that the computers and the various elements within them and elements that use them, like the web, can all communicate to each other. We study human and animal communication and really most people just stop there when it comes to thinking about objects and entities that can communicate with each other. It's just weird to think that computers are doing that the entire time we are using them. Conveying and transporting information between each other, in a language of their own. It's kind of fun to imagine what the computers would be saying to each other if they were speaking English or if we could actually understand them.

I wonder what their take on the Olympics would be?



Image available: http://www.lobsangrampa.org/research.html

Tuesday, 31 July 2012

IP - Inexplicably Perplexing

Whilst the beginning of today's INFS lecture was perhaps relatively easy to understand (if not to follow), the second half confused me slightly (drastic understatement) and kind of made my brain ache (drastic understatement). But, as I have agreed that this subject shall not get the best of me, I soldiered on and made it to the end of the lecture with limited battle wounds and personal scarring.



Following the lingering confusion of last week's lectures, and sensing that it might be a recurring predicament that I find myself in, I thought it would be a good idea to do as much terminology research as I could regarding any aspect of information technologies that I thought I would still be having a problem with. Which is a lot. I managed to get a sort of preliminary understanding of things that would commonly be found in discussions about the web. Things like HTML and XML and JavaScript and API. I now have a book that is beginning to look like a glossary or dictionary because of all the terms I have written in there. And IP was one of the terms that I had looked up in this preliminary search.

IP, it would seem, is a lot more in-depth than what I originally perceived it to be. I managed quite easily to grasp the idea that you need an individual IP address for each individual device. I mean, look at humanity. Whilst we as humans are rather similar, we each have unique names that separate us from each other. Why shouldn't laptops, desktop computers, printers and iPads be the same? And I also understood what Prof. Long was talking about when he mentioned that the space being taken up by these unique addresses was running out and that we needed to expand that space in order to include more addresses. That's just logical.

But what really got me confused was when Prof. Long started talking about IP stacks and TCP/IP and how these all link together. It was just a lot for me to take in and to understand in what seemed to be a really short amount of time.

So again, I hit the research decks.

IP stands for Internet Protocol, that much is clear. It is the principal set of rules of communication used for moving datagrams, or network packets, from one host to another (that is, from one computer hooked up to a computer network to another computer hooked up to a computer network) across the Internet, as part of the IP suite, which we'll come back to later. The primary role of IP is to determine which networks are connected to which other networks as a part of the Internet. In knowing which networks link with which other networks, IP can figure out the routing paths along which these datagrams can be sent.

A brief interlude from IP while we talk about datagrams.

I think I could grasp the general feel of datagrams, enough to put them into simpler terms for myself at least. A datagram is made up of two parts, the first being a sort of label, called a header, in which the identifying information is stored; this means that routing can take place without the network and its equipment needing any preconceived knowledge about that datagram. The second part is the data itself, the actual information that is being sent from one host to another.
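Putting that two-part structure into code, here's a toy datagram sketched as a Python dictionary (the field names are my own simplifications, not the real IPv4 header layout):

```python
# A toy datagram, split into the two parts described above: a header
# with the identifying/routing information, and a payload of data.
datagram = {
    "header": {
        "source": "192.168.1.10",
        "destination": "172.16.0.5",
        "ttl": 64,
    },
    "data": b"hello from one host to another",
}

# A router only needs to read the header to forward the datagram...
print(datagram["header"]["destination"])  # 172.16.0.5
# ...while the actual data travels along untouched.
print(datagram["data"])
```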

So, at this point in time, we can see that the datagram is within the IP. And that the IP's job is to work with addressing, figuring out what the unique IP address is, figuring out what is within the datagram and then figuring out which networks to use to send that datagram along to get to the host it's labelled for. We can also see that the IP is a part of the Internet Protocol Suite.

But "What is the IP suite?" I hear you say (in a tone that suggests you just stopped tearing your hair out with your hands in exasperation long enough to hear me).

Well, again, doing that little extra bit of researching worked to locate a couple of little things that could help me.

First of all, the Internet Protocol Suite is often referred to as TCP/IP, because the original program (the Transmission Control Program of 1974) included both IP and the Transmission Control Protocol (TCP), which is still widely used today, although not by all devices. TCP/IP determines five things about data: how it should be formatted, addressed, transmitted, routed and received at the destination. To do this, the suite has four layers that are each individually governed by their own rules. Like I said, each layer is individual, or abstracted, which means that via the process of encapsulation the stack is divided into its four components. What encapsulation does is separate the functions of a particular layer from the functions of the other layers, through information hiding (where the aspects of a design most susceptible to change are kept apart from the aspects that cannot be changed without risking breaking everything). From this, a sort of ranking of the layers can be made, with the more abstracted at the top of the stack and the more specific at the bottom (referred to as the upper-layer and lower-layer protocols respectively). Each higher layer adds more features.

The lowest of the TCP/IP layers is the link layer. This layer contains the communication technology for a local network (a smaller computer network, mainly for computers at home or school), Ethernet being the best-known example.

The next layer is the internet layer, which is where IP addresses come into play, connecting local networks together and consequently establishing internetworking.

Following this we have the transport layer, handling all host-to-host communication.

Finally, at the very top of the protocol stack we have the application layer, where all the protocols for specific data communications services on a process-to-process level are contained. It is here that the protocol for how a browser communicates with a web server is determined.
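The four layers above can be sketched as data being wrapped, layer by layer, on the way down the stack. This is a toy Python illustration with placeholder header fields of my own invention, not real protocol formats:

```python
# Encapsulation sketch: each layer wraps the payload it receives from
# the layer above in its own header on the way down the stack.
def encapsulate(message):
    application = {"protocol": "HTTP", "payload": message}
    transport = {"src_port": 54321, "dst_port": 80, "payload": application}
    internet = {"src_ip": "192.168.1.10", "dst_ip": "93.184.216.34",
                "payload": transport}
    link = {"src_mac": "aa:bb:cc:dd:ee:ff", "payload": internet}
    return link

frame = encapsulate("GET / HTTP/1.1")

# Peel the layers back off, from the bottom of the stack inwards,
# and the original application data comes out untouched.
print(frame["payload"]["payload"]["payload"]["payload"])  # GET / HTTP/1.1
```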

And so, after all of that research, that is the point at which I realised the power that this blog had. By enabling me to go back and kind of re-think the work that had been done in the lecture (on both my behalf as a listener, and on Prof. Long's behalf as a lecturer) I was able to better understand some of the terminology and what was being discussed. I am in no way implying that I definitely understand everything. Still a lot of it goes flying over my head, but the point is that I can sort of grasp the concepts a little bit more. So before, when I viewed this blog as yet another tedious task intent on stealing all my spare time, I was certain that I would learn nothing from this except how to handle failure gracefully. Now I am thinking that perhaps "how to just pass with grace" might be a better song to sing.



Image available: http://internet-texpert.blogspot.com.au/2011/07/virtual-internet-protocol.html 

Sunday, 29 July 2012

Olympics: Really for All the World to See?

With the London 2012 Olympic Games currently underway, having started as of Friday 27th July, viewers across the world are tuning into their television sets to make sure that they don't miss their favourite event, be it swimming, shooting, hockey (my personal fave) or indeed any of the other 36 sports that comprise the modern Olympic Games.

But being that we are in the "modern age", indeed, that we are in the modern age of technology, why is it that there is such a strict set of guidelines on what elements of the Games can and can't be uploaded to the Internet?

Here's how it all started: I'm going to assume that you all watched the opening ceremony. Maybe not at 6am but perhaps in the afternoon when it was repeated around 3 o'clock. Well, my brother, dad and I are the kind of crazy people who get up early of a morning anyway, so we had no problem getting up to watch it live at 6. I really liked it and so did my brother. Our favourite part was Rowan Atkinson as Mr. Bean playing the keyboard for "Chariots of Fire". My brother tried to look it up on YouTube and watch it again. However, when he attempted to do this, locating a video he thought would work, a message of some similar description to "This video has been deleted due to copyright infringements" appeared. Then today, in the process of writing this post, I looked up the same set of words again. I found a video that worked, except several sections of the clip had been cut and chopped and moved about, different from how I remember them being at the actual event.

Now, I understand about the copyright laws and privacy policies attached to uploading content onto the Internet. Maybe not as much as I should know, but I understand the basic idea behind it. But I've never really considered the power that an institute or an organisation has such that they can just demand that a video be removed. The International Olympic Committee (IOC) are very strict indeed when it comes to knowing their rights, whether it be against content on the Internet, or in dealing with companies who want to use the Olympic logo (the five rings) for some promotional campaign. And with a logo that was recognised by more than 93% of the world in 2001, I'm pretty sure you have to be that guarded when it comes to who uses your symbol, your footage, basically everything that constitutes you as an organisation.

It's a very interesting topic for me, something that I guess I really hadn't given that much thought to beyond what I knew. Everything that I've uploaded has been either original content or something for which I've specified where it came from, placing a reference at the bottom of the page or something. So this is something that I can say with confidence I'm looking forward to throughout this course: finding out more about the legality side of the Internet and indeed the Web.

Who knows. Maybe I'll learn how to track down a legit copy of Mr. Bean at the Olympics!


Image available: http://questgarden.com/97/47/3/100301103814/

Thursday, 26 July 2012

World Domination

Much like the mullet hairstyle of the 80's, the daisy chain necklaces of the 60's and the high-waisted bootleg jeans of the 70's, social networking has clearly become a fast paced trend that the world is just dying to keep up with. But is it really just a trend? Has it actually, in all its platforms and programs, become a way of life?



If you look back over the years, early networking technology gave humanity the chance to communicate effectively and easily, with merely the push of a few simple buttons and the lift of a handle. I am of course talking about the telephone. Back in those days, telephones were necessary items for a household to have. They simply made life easier. People didn't have to wait as long for information to come to them (sorry postman, your job just became outdated), and although there was still a human involved at the switchboard, it was the speed and the up-to-date feel that everybody loved. And I mean everybody. All generations looked at the telephone as being "for the better". Wars were won with the use of the telephone. Everybody loved that. The telephone gave the chance for everybody to move forward at the same pace.

So why then are not all generations looking at social networking, which in its own way has become the telephone of today, with as much excitement and desire as they did for the telephone? Perhaps, in Grandma's own special way, she's trying to tell you, through all the disregard and "back in my day" comments, that this new technological world is just moving at too fast a pace for her poor mobility scooter to keep up with. Not every generation is able to move forward at that same pace that the telephone did. I mean, the telephone really was a device that no one understood how to use. Everyone was learning at roughly the same time.

But with social networking sites, it's the teens who were at the forefront of the learning curve, leading the way instead of watching what mum and dad were doing and going from there. And I guess, the further that we get ahead with our advanced social networking understandings the further Gran and Gramps get left behind.

And here's something to make the situation even more dire for the more mature social networkers: as I understand it, you can now buy shares in Facebook and with the release this week of the financial results for the first quarter of public trading, a lot of eyes are looking at Facebook to see what it will do next. Which, rumours have it, apparently means making their own phone.

The "Buffy" phone (I don't understand where the name comes from either) would totally centre around social networking. Now, I get that you 'must' see what your next-door-neighbour's-sister's-cat's-best-friend is up to, but really?! To need a device that is totally centred around YOU and YOUR connections to people is a bit much. I mean, is nothing sacred any more?

As Carolina Milanesi, an analyst for technology research centre Gartner, sees it, "The reality is that most consumers are perfectly happy with an app on their current phone. We believe that a deeper integration of Facebook on the current operating systems iOS, Android and Windows Phone will deliver a much wider addressable market to Facebook than a dedicated phone. And what is social about if not the mass market?" (BBC, 2012)

So maybe what we need to look at is this: we live in an age where a single company can, in some ways, have supreme domination over all of our lives. Maybe Gran and all her rants actually have a point. We have come a long way from the days of telephones. Yet, at that time, the Bell company reigned supreme. Is there some possibility that we are stuck in an endless loop, continually falling inferior to some dominating technology company? How long will it be until the days of robot domination, when we all succumb to our evil automaton overlords and cry, "have mercy on our souls!"?

Now there's a picture, hey?


For further readings: Facebook: The Challenges Ahead for the Social Network (BBC report, 2012)


Image available: http://jamiemcintyre.com/facebook-shares-pushed-100billion-market/